• 0 Posts
  • 1.05K Comments
Joined 5 months ago
Cake day: December 6th, 2024



  • Well, this being the Internet it’s natural to expect less than impeccable truth from strangers here, for a few reasons: a lot of people just want to feel like they “won” the argument no matter what, so they’ll bullshit their way into a “win”; most people aren’t really trained in the “trying to be as complete and clear as possible” mental processes that Engineers and Scientists are (so there’s a lot of “I think this might be such” being passed off as “it is such”); and it simply feels bad to be wrong, so most people don’t want to accept it when somebody else proves them wrong and react badly to it.

    I’m actually a trained Electronics Engineer, but since I don’t actually work in that domain and studied it decades ago, some of what I wrote are informed extrapolations based on what I learned and stuff I read over the years, rather than me being absolutely certain that’s how things are done nowadays (which is why looking up and reading that Intel spec was very interesting, even if it turned out things are mainly as I expected).

    Also I’m sorry for triggering you; you don’t need to say sorry for your reaction and I didn’t really take it badly: as I said, this is the Internet and a lot of people are argumentative for the sake of “winning” (probably the same motivation as most gaslighters), so I expect everybody to be suspicious of my motivations, same as they would be of any other random stranger ;)

    Anyways, cheers for taking the trouble of explaining it and making sure I was okay with our interaction - that’s far nicer and more considerate than most random internet strangers.



  • From what I’ve heard the extreme racism is generalized in the Mainstream Press in Israel, so I’m not sure that it’s Social Media that’s making genocide along ethnic lines more broadly accepted there than it was in Nazi Germany.

    That said, the fables in Israel around “The Jewish People” and its identity, with the Palestinians as “antagonists”, have been spun for far longer, and the antagonism has been going on for far longer (since the creation of the state of Israel), than the ones in Nazi Germany with Jews as the “antagonist”. So maybe the much longer time, about 70 years, during which Israelis have been indoctrinated about their own ethnic superiority and the “antagonism” of Palestinians, compared to the length of the similar indoctrination of the population in Nazi Germany, is the main cause of the broader acceptance of genocide among the populace in Israel than in Nazi Germany.

    If you’ve grown up hearing about how you’re part of a superior people (“God’s chosen people”) and Palestinians are The Enemy who hate you and want you dead, and your parents too grew up like that, you’re probably far, far more likely to be an extreme racist who truly believes themselves to be inherently superior and sees Palestinians as not really human like themselves (hence having zero empathy for the suffering of Palestinians and maybe even relishing it) than somebody who has only been exposed to similar shit for a decade or two.


  • Well, I wasn’t sure if you meant that I did say that or if you just wanted an explanation, so I both clarified what I said and I gave an explanation to cover both possibilities :)

    I think the person I was replying to just got confused when they wrote “integrated memory”, since, as I explained, when main memory is “integrated” in systems like these that just means it’s soldered on the motherboard, which really makes no difference in terms of architecture.

    There are processing units with integrated memory (pretty much all microcontrollers), which means they come with their own memory (generally both Flash and SRAM) in the same integrated circuit package or even on the same die, but that’s at the very opposite end of processing power from a PC or PS5, and the memory amounts involved tend to be very small (a few MB or less).

    As for the “integrated graphics” bit, that’s actually the part that matters for the performance of systems with dedicated CPU and GPU memory vs systems with shared memory (integrated into the motherboard or otherwise, since being soldered on the motherboard or coming as modules doesn’t really change the limitations of each architecture), which is what I was talking about back in the original post.



  • Just about all houses in Europe are made of brick.

    They’re no more expensive than houses in the US.

    Not saying which is better (frankly I don’t know), just pointing out that it really isn’t as straightforward as using brick rather than wood making US house prices go up - maybe in the past, but nowadays land prices, manpower costs and speculation are what drive real estate prices.

    (After all, brick is basically baked clay, so hardly expensive stuff)

    Also, as somebody else pointed out, brick houses last significantly longer than wood houses.


  • There is no such thing as a “Jew memory” shared by all Jews.

    The individuals doing this shit certainly heard the stories as that stuff is constantly repeated and part of the national identity in Israel, but they didn’t feel the suffering and in fact most of them don’t even come from families which suffered in the Holocaust as most Israelis are not from families that came from Western Europe.

    Even people who felt the horror by being on the other side need to actually be able to empathise with their victims to not want to inflict that same horror on others: in other words, they need to not be sociopaths or psychopaths, and to see those victims as fellow human beings. Notice how in Israel, just like in Nazi Germany, the targeted ethnicity is constantly dehumanized with extreme racist statements, to the point that Palestinians in Israel are called “human animals”, similarly to how Jews in Nazi Germany were “Untermenschen” (sub-humans).

    With a population where 99.999% did not in fact suffer anything in the Holocaust (there is literally only a handful of people in Israel who were alive back then, living in Western Europe, and who were caught up in it), with an actual constitution stating the superiority of their race (literally “the chosen people”) and enshrining discriminatory treatment by race (the Israeli constitution says that only Jews can be Israeli Nationals), under constant extreme racist propaganda in most of the Press and in statements from politicians very literally dehumanizing the ethnicity they’re targeting (“human animals”), and with sociopaths and psychopaths not just part of the highest levels of government and the military but in general being given free rein to execute their most violent fantasies with no pushback as long as it’s against the “lesser” races, this kind of shit is not just unsurprising, it’s inevitable.




  • Hah, now you made me look that stuff up. I was talking anchored on my knowledge of systems with multiple CPUs and shared memory, which was my expectation for the style of system architecture of the PS5, since in the past that’s how they did things.

    So, for starters I never mentioned “integrated memory”, I wrote “integrated graphics”, i.e. the CPU chip comes together with a GPU, either as two dies in the same chip package or even both on the same die.

    I think that when people talk about “integrated memory” what they mean is main memory which is soldered on the motherboard rather than coming as discrete memory modules. From the point of view of systems architecture it makes no difference, however from the point of view of electronics, soldered memory can be made to run faster because soldered connections are much closer to perfect than the mechanical contact connections you have for memory modules inserted in slots.

    (Quick explanation: at very high clock frequencies the electronics side starts to behave in funny ways. The frequency of the signal travelling on the circuit board gets so high, and hence the wavelength gets so small (down to centimeters or even millimeters, around the scale of the length of circuit board traces), that you start getting effects like signal reflections and interference between circuit lines, because they work as mini antennas and can induce effects on nearby lines, so it’s all a lot messier than if the thing was just running at a few MHz. Wave reflections can happen at connections which aren’t perfect, such as the mechanical contacts of memory modules inserted into slots, so at higher clock speeds the signal integrity of the data travelling to and from the memory is worse than with soldered memory, whose connections are much closer to perfect.)
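
    To put rough numbers on that wavelength argument, here is a small sketch (the 0.5 velocity factor for FR-4 circuit board material is a typical ballpark figure I’m assuming, not a measured value):

```python
# Rough illustration: signal wavelength on a circuit board trace at
# various clock frequencies. Once the wavelength shrinks toward the
# length of typical board traces (centimeters), transmission-line
# effects like reflections and crosstalk start to matter.

C = 299_792_458        # speed of light in vacuum, m/s
VELOCITY_FACTOR = 0.5  # signals on FR-4 PCB travel at roughly half of c (assumed)

def wavelength_mm(freq_hz: float) -> float:
    """Wavelength in millimeters of a signal at freq_hz on a PCB trace."""
    return C * VELOCITY_FACTOR / freq_hz * 1000

for f in (100e6, 1e9, 6.4e9):  # 100 MHz, 1 GHz, and a DDR5-class transfer rate
    print(f"{f / 1e9:5.1f} GHz -> {wavelength_mm(f):7.1f} mm")
```

    At a hundred MHz the wavelength is over a meter, so the board is “small” relative to it; in the GHz range it drops to tens of millimeters and the traces start behaving like the mini antennas described above.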

    As far as I know, nowadays the L1, L2 and L3 caches are always part of the CPU/GPU die, though I vaguely remember that in the old days (80s, 90s) the memory cache might be in the form of dedicated SRAM modules on the motherboard.

    As for integrated graphics, here’s some reference for an Intel SoC (system on a chip, in this case with the CPU and GPU together in the same die). If you look at page 5 you can see a nice architecture diagram. Notice how memory access goes via the memory controller (lower right, inside the System Agent block) and then the SoC Ring Interconnect which is an internal bus connecting everything to everything (so quite a lot of data channels). The GPU implementation is the whole left side, the CPU is top and there is a cache slice (at first sight an L4 cache) at the bottom shared by both.

    As you see there, with integrated graphics the memory access doesn’t go via the CPU; rather there is a memory controller (and, in this example, a memory cache) for both, and memory access for both the CPU and the GPU cores goes through that single controller and shares that cache (but lower level caches are not shared: notice how the GPU implementation contains its own L3 cache - bottom left, labelled “L3$”).

    With regards to the cache problems I mentioned in the previous post, at least that higher level (L4) cache is shared, so instead of cache entries being invalidated because the main memory was changed from outside, what you get is a different performance problem: competition for cache usage between the areas of memory used by the CPU and the areas used by the GPU. As the cache is much smaller than the actual main memory, it can only hold copies of part of it, and if two devices are using different areas of main memory they’re both causing those areas to get cached, but the cache can’t fit both. Depending on the usage pattern, it might constantly be ejecting entries for one area of memory to make room for entries for the other and back, which in practice makes it as slow as not having any cache there (there are lots of tricks to make this less of a problem, but it’s still slower than having each processing device use its own cache and its own memory).
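
    That cache-competition effect is easy to see in a toy model. The sketch below is purely illustrative (hypothetical block numbers and sizes, a plain LRU cache standing in for the real replacement policy): two devices cycle over disjoint working sets through one shared cache, and the hit rate collapses as soon as the combined working set no longer fits.

```python
from collections import OrderedDict

class LRUCache:
    """Minimal least-recently-used cache keyed by memory block number."""
    def __init__(self, capacity: int):
        self.capacity = capacity
        self.entries = OrderedDict()
        self.hits = self.misses = 0

    def access(self, block: int) -> None:
        if block in self.entries:
            self.hits += 1
            self.entries.move_to_end(block)       # mark as most recently used
        else:
            self.misses += 1
            self.entries[block] = True
            if len(self.entries) > self.capacity:
                self.entries.popitem(last=False)  # evict least recently used

def hit_rate(blocks_per_device: int, capacity: int, rounds: int) -> float:
    """Two devices interleaving accesses to disjoint working sets via one cache."""
    cache = LRUCache(capacity)
    for _ in range(rounds):
        for b in range(blocks_per_device):
            cache.access(b)           # "CPU" working set
            cache.access(10_000 + b)  # "GPU" working set, disjoint blocks
    return cache.hits / (cache.hits + cache.misses)

# Cache fits one device's working set but not both: every access evicts
# a block the other stream still needs, so the hit rate drops to zero.
print(hit_rate(blocks_per_device=64, capacity=64, rounds=10))   # 0.0
print(hit_rate(blocks_per_device=64, capacity=128, rounds=10))  # 0.9 (all hits after warm-up)
```

    With capacity for only one working set, every single access past the first round is a miss; doubling the capacity so both working sets fit takes the hit rate to 100% after warm-up.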

    As for contention problems, there are generally way more data channels in an internal interconnect like the one you see there than in the data bus to the main memory modules, plus that internal interconnect will be way faster, so contention in memory access will be lower for cached memory. With cache misses (memory locations not in the cache, which hence have to be loaded from main memory), that architecture will still suffer from two devices sharing the main memory and thus having to share that memory’s data channels.


  • When two processing devices try to access the same memory there are contention problems, as the memory cannot be accessed by two devices at the same time (well, sorta: parallel reads are fine, it’s when one side is writing that there are problems), so one of the devices has to wait. This makes it slower than dedicated memory, but the slowness is not constant since it depends on the memory access patterns of both devices.

    There are ways to improve this: for example, if you have multiple channels on the same memory module then contention issues are reduced to accesses within the same memory block (which depends on the block size), though this also means that parallel processing on the same device - i.e. multiple cores - cannot use the channels being used by a different device, so it’s slower.

    There are also additional problems with things like memory caches in the CPU and GPU - if an area of memory cached in one device is altered by a different device that has to be detected and the cache entry removed or marked as dirty. Again, this reduces performance versus situations where there aren’t multiple processing devices sharing memory.

    In practice the performance impact is highly dependent on whether and how the memory is partitioned between the devices, as well as on the amount of parallelism in both processing devices (the latter because, as per my point above, memory modules have a limited number of memory channels, so multiple parallel accesses to the same memory module from both devices can lead to stalls in the cores of one or both devices when not enough channels are available for both).
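
    The channel-sharing point can also be illustrated with a toy model (hypothetical numbers; real memory controllers queue and reorder requests, which this deliberately ignores): each device issues one request per cycle to a random channel, a channel can serve only one request per cycle, and everything else stalls.

```python
import random

def stall_fraction(channels: int, devices: int, cycles: int, seed: int = 0) -> float:
    """Fraction of memory requests stalled because another device picked
    the same channel in the same cycle (one request served per channel)."""
    rng = random.Random(seed)
    stalls = requests = 0
    for _ in range(cycles):
        wanted = [rng.randrange(channels) for _ in range(devices)]
        requests += devices
        for ch in set(wanted):
            stalls += wanted.count(ch) - 1  # only one request per channel proceeds
    return stalls / requests

# One device never collides with itself; two devices sharing 4 channels
# collide whenever they pick the same channel, stalling roughly 1 in 8
# requests even in this optimistic uniformly-random access pattern.
print(stall_fraction(channels=4, devices=1, cycles=100_000))
print(stall_fraction(channels=4, devices=2, cycles=100_000))
```

    Real access patterns are far less uniform than this, which is exactly why the impact depends so much on how the memory is partitioned between the devices.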

    As for the examples you gave, they’re not exactly great:

    • First, when loading models into GPU memory, even with SSDs the disk read is by far the slowest part and hence the bottleneck, so as long as things are being done in parallel (i.e. whilst data is being loaded from disk to CPU memory, already loaded data is also being copied from CPU memory to GPU memory) you won’t see that much difference between loading to CPU memory and then from there to GPU memory, and loading directly to GPU memory. Further, the manipulation of models in shared memory by the CPU introduces the very performance problems I was explaining above, namely contention from both devices accessing the same memory blocks and GPU cache entries getting invalidated because the CPU altered that data in main memory.
    • Second, if I’m not mistaken, tone mapping is highly parallelizable (as pixels are independent - I think, but I’m not sure since I haven’t actually implemented this kind of post-processing), which means that by far the best device at parallel processing - the GPU - should be handling it in a shader, not the CPU. (Mind you, I might be wrong in this specific case if the algorithm is not highly parallelizable. My own experience with doing things on the CPU vs in shaders running on the GPU - be it image shaders or compute shaders - is that for highly parallelizable stuff, a shader on the GPU is way, way faster than an algorithm running on the CPU.)
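
    As an aside on why per-pixel operations map so well onto parallel hardware: a simple Reinhard-style tone mapping operator (my choice of operator for illustration, not necessarily what any given engine actually uses) touches every pixel independently, which is exactly the shape of work a GPU shader is built for. A NumPy sketch:

```python
import numpy as np

def reinhard_tonemap(hdr: np.ndarray) -> np.ndarray:
    """Compress HDR radiance values in [0, inf) into display range [0, 1).
    Purely per-pixel: each output value depends only on the corresponding
    input value, so the work is trivially parallelizable (ideal for a shader)."""
    return hdr / (1.0 + hdr)

# A fake 2x4 "HDR image" with some values well above display range:
hdr = np.array([[0.0, 0.5, 1.0, 4.0],
                [9.0, 0.1, 2.0, 0.0]])
ldr = reinhard_tonemap(hdr)
print(ldr.min(), ldr.max())  # everything lands inside [0, 1)
```

    Because no pixel reads any other pixel, the same one-liner translates directly into a fragment or compute shader with one thread per pixel.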

    I don’t think that direct access by the CPU to manipulate GPU data is at all a good thing (for the reasons given above), and to get proper performance out of a shared memory setup, at the very least the programming must be done in a special way that tries to reduce collisions in memory access, or the whole thing must be set up by the OS like it’s done on PCs with integrated graphics, where a part of main memory is reserved for the GPU by the OS itself when it starts and the CPU won’t touch that memory after that.


  • I’ve lived in both the UK and The Netherlands.

    IMHO, it’s to do with how the UK is socially a very classist society where people worry a lot (insanely so compared with The Netherlands) not just about their place on the social ladder but about it being visible to others. The TV sitcom Keeping Up Appearances is actually a pretty good illustration of this: even though it’s a comedy, and thus exaggerated in how the characters display such traits and act on them, the way of thinking of the characters is based on how people in Britain (especially England) tend to see their standing in society and the importance they give to projecting the “right” appearances (part of what makes that comedy funny is that it’s a satire of certain traits of British society; a lot of British comedy is even funnier once you’ve lived there for a while and start getting the in-jokes).

    Then overlaid on this is the common take there on social climbing, which is to spend far more time and effort trying to stop others below oneself on the social ladder from climbing than on climbing oneself. People like to look down on those seen as lower status, expect others to “know their place”, and will actually put some effort into making sure those who don’t are punished for it.

    This is, IMHO, why punishing the poor is so popular in Britain. It also anchors a lot of the anti-immigration feeling since there is no lower class in British Society than non-Britons.

    As for other Anglo-Saxon countries, I don’t really know.


  • Interestingly, people who learned to use PCs back in the early days most likely installed Windows themselves on their own MS-DOS PCs and probably also upgraded it themselves, whilst Mac users did not.

    Which kinda gives weight to the idea that it’s the technical barrier to entry into using a certain OS that makes for tech-savvy users of that OS: they had to be tech savvy already (or at least have the mindset of trying stuff out, which is IMHO what creates tech-savvy users) in order to get that OS running.


  • I think that’s a pretty recent phenomenon and it still requires that there’s a good friend or family member who is a Techie to actually happen.

    That said, thinking about your post does bring a whole “chicken and the egg” possibility to mind: are Linux users tech savvy because of the open nature of Linux or are Linux users tech savvy because for most people the technical barrier to entry into running Linux is still high enough that they have to be tech savvy to begin with in order to start running Linux?




  • Back when the Leave Referendum in Britain was won by Leave, in the rest of Europe the number of people who wanted a similar referendum instantly dropped by half in just one month, and the far-right parties who until then had been vocally anti-EU suddenly stopped being so.

    As I see it, judging by what’s happened in Canada and Australia, as well as by the very overt distancing from Trump by most of the far-right in Europe, Americans are taking it on the chin for the rest of us.

    Now if only they themselves would learn that supporting fascism abroad also breeds fascism at home (I just found out that AOC voted in favour of a new bullshit “anti-semitism” definition from the Israeli lobby, so I guess not even the supposedly “left” wing of the Democrat Party has learned the lesson yet).