By 2035, falling satellites will kill or injure someone every two years, says FAA
  • Who wouldn't? They are doing some of the most advanced rocket science on the planet. Of course, trusting corporations' statements and research is an entire topic of its own. Taking Elon Musk seriously, on the other hand...

  • A King regardless
  • Thanks! Both look like very decent studies, so I am not certain where the difference comes from. I suspect that the division into age brackets, or averaging across all of them, may be the cause. Either way, it seems that the effects of being slightly overweight are barely statistically significant. The more you know

  • every day I check my email for the tangible confirmation that I am a failure
  • And when someone says "dream job", are they referring to the semantically correct meaning of the word? I have my doubts. When people say their dream job consists of doing something like "helping people", I think it is the "work" that interests them, and not the financial details.

    What you call sick is only sick if you take your awfully correct definition, which I honestly don't think correlates well with what people mean by it.

    That's also why I would still tend to agree with you, because I don't believe in laboring for some boss's benefit either. But certainly not with the initial wording.

  • every day I check my email for the tangible confirmation that I am a failure
  • Nah, that's bullshit. And this is coming from somebody who would tend to agree with you, but you can't always be so excruciatingly black and white. For example, my dream job is what I do in my free time, except in a non-profit organisation where I am not chained by an individual lack of resources. Some work furthers humanity. Some work is completely voluntary. Sometimes a dream job is a way to scratch the biological itch to keep our brains busy.

    Additionally, this supports the bullshit capitalist argument that people wouldn't want to work anymore if not coerced into it. I believe people would still dream of doing important jobs that help humanity, even of their own free will.

  • Starlink quietly lost over $250,000,000 in burned satellites this summer.
  • The difference between an orbit that lasts 5 years and one that lasts a hundred is approximately 100-200 km; the limit is quite sharp and actually quite tricky to get exactly right. That will cost you about a millisecond or two in latency, tops. It is more likely that SpaceX is required to adhere to rules made by the FCC/FAA.
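
    A back-of-the-envelope check on the latency claim, assuming the extra path is simply the altitude difference traversed at the speed of light (ignoring routing, beam geometry, and atmosphere):

    ```python
    # Extra one-way latency from raising an orbit by 100-200 km.
    C = 299_792_458  # speed of light in vacuum, m/s

    for delta_km in (100, 200):
        extra_ms = delta_km * 1_000 / C * 1e3
        print(f"+{delta_km} km altitude -> ~{extra_ms:.2f} ms extra one-way latency")
    # +100 km -> ~0.33 ms, +200 km -> ~0.67 ms (so ~1.3 ms round trip at most)
    ```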

  • Cancer cases in people below 50 up nearly 80 percent in last three decades
  • Well, the article refers to both :)

    I think you'd be right about the "number of diagnoses" statement in the title, but I think the discussion is about the deaths due to cancer, which have also increased and would not have as strong a correlation, for the reasons others mentioned.

  • Deleted
    Fanart is Art
  • What the hell is this? I am wondering if the people in this picture have ever even met boys and men who wrote fan fiction, because it sure as hell never was cool. Writing in general, in many genres like romance, poetry, and of course fan fiction, got young men I knew bullied. The girls I knew were also made fun of for it, but typically less so. Except for creative writing being more normalized for women in the cultures I have experienced, I would argue this is a gender-agnostic issue. The later posts get it, imo.

  • US sues Elon Musk’s SpaceX for alleged hiring discrimination against refugees
  • From briefly having worked on a project where this was a relevant issue (I had to throw good people of foreign nationality off the team due to higher-up NASA decisions): ITAR also becomes relevant when you want to access data and hardware that is ITAR-regulated for use in your mission. This is the case for all space missions -- even for SpaceX, who likes to do things in-house -- since the advanced electronics, alloys, etc. will come from elsewhere and fall under regulation.

  • Bug Fact rule
  • Cool, didn't think of that one. But it would still work, since you could consider it a constant in front of the f(x) that is not raised to the nth power (easier to imagine if we have a constant function; then it's just (b-a)). The nth root will then normalise it to 1 for any real factor.
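
    Spelled out, assuming the rule in question is the usual $L^n$-norm limit $\lim_{n\to\infty}\left(\int_a^b f(x)^n\,dx\right)^{1/n} = \max_{[a,b]} f$: for a constant function $f \equiv m$,

    $$\left(\int_a^b m^n\,dx\right)^{1/n} = \left((b-a)\,m^n\right)^{1/n} = (b-a)^{1/n}\,m \;\xrightarrow{\;n\to\infty\;}\; m,$$

    since $k^{1/n} \to 1$ for every real $k > 0$ -- which is exactly how any constant factor in front gets normalised away.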

  • A very dairy meme
  • I don't know a single person who consumes milk because they think they require it. They just like the taste of dairy products.

    The subsidization is an issue imo, but I don't think people are as brainwashed regarding milk as you assume.

  • ULTRARAM may be a silly name but it's the holy grail for memory tech and means your PC could hibernate for over 1,000 years
  • It should be fine for normal use cases when paired with error-correcting codes, even without any active scrubbing.

    Going by error rates for ECC RAM (which should be comparable to within an order of magnitude) of 1 bit error per gigabyte of RAM per 1.8 hours [1], we would expect ~5000 errors in a year. The average likelihood of a new error hitting an already affected byte is approx. (5000/2)/1e9 ≈ 2.5e-6. So that probability times 5000 errors gives about a 1.2 percent chance that two errors occur in the same byte within a year (rough arithmetic in the sketch below). It grows roughly quadratically once you go past a year. But in total, I would say that standard error-correcting codes should be sufficient to catch all errors, even in hibernation for a whole year.

    [1] https://en.wikipedia.org/wiki/ECC_memory
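
    A quick sanity check of those numbers, using the same assumed error rate from [1]:

    ```python
    # Birthday-style estimate: chance that two of N random single-bit errors
    # land in the same byte of a 1 GB module within one year.
    errors_per_hour = 1 / 1.8            # assumed: 1 bit error per GB per 1.8 h
    bytes_total = 1e9

    n = errors_per_hour * 24 * 365       # ~4867 errors in a year
    expected_double_hits = n * (n / 2) / bytes_total
    print(f"errors/year ≈ {n:.0f}, P(byte hit twice) ≈ {expected_double_hits:.2%}")
    # errors/year ≈ 4867, P(byte hit twice) ≈ 1.18%
    ```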

  • ULTRARAM may be a silly name but it's the holy grail for memory tech and means your PC could hibernate for over 1,000 years
  • TMR (triple modular redundancy, i.e. the triplicate method) wouldn't be super suitable for this kind of application, since it is a bit overkill in terms of redundancy. Just from an information-theory perspective, you should only carry enough parity for the amount of corruption you are expecting (in this case, not a lot -- maybe a handful of bits after a year or two). TMR is optimal when you expect the whole result to be either wrong or right, not just corrupted. ECC and periodic scrubbing should be suitable for this; that is what is done by space-grade processors and RAM.
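
    A toy sketch of the overhead gap (illustrative only; real ECC RAM uses SECDED codes such as Hamming(72,64), which are not implemented here):

    ```python
    # TMR stores three full copies (200% overhead) and takes a bitwise majority
    # vote, while a SECDED code protects 64 data bits with only 8 check bits
    # (~12.5% overhead) -- plenty when you expect a few isolated flips per year.

    def tmr_vote(a: int, b: int, c: int) -> int:
        """Bitwise majority of three redundant copies."""
        return (a & b) | (a & c) | (b & c)

    word = 0b1011_0110
    corrupted = word ^ 0b0000_0100       # single bit flip in one copy
    assert tmr_vote(word, word, corrupted) == word
    print("TMR overhead: 200%  vs  SECDED overhead: ~12.5%")
    ```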

  • ULTRARAM may be a silly name but it's the holy grail for memory tech and means your PC could hibernate for over 1,000 years
  • The gold around satellites is actually very thin layers of Mylar, aluminum foil, and Kapton (a type of golden, transparent plastic), which are used to keep heat inside the satellite inside, and heat outside, outside (see Multi-Layer Insulation). Radiation shielding usually comes from the aluminum structural elements of the spacecraft, or sits close to the electronics so you do not waste too much mass on shielding material. Basically, shielding efficacy is mostly determined by its thickness, so it quickly becomes quite heavy.
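
    To put "quickly becomes quite heavy" into rough numbers (assumed: a flat 1 m² aluminum panel; real shielding geometry is more involved):

    ```python
    # Shield mass grows linearly with thickness: mass = area * thickness * density.
    AL_DENSITY = 2700  # kg/m^3, aluminum
    area_m2 = 1.0      # assumed panel size

    for thickness_mm in (1, 2, 5, 10):
        mass_kg = area_m2 * (thickness_mm / 1000) * AL_DENSITY
        print(f"{thickness_mm:>2} mm Al over 1 m^2 -> {mass_kg:.1f} kg")
    # 1 mm -> 2.7 kg ... 10 mm -> 27.0 kg, per square meter of coverage
    ```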

  • What would 128 bits computing look like?
  • Indeed, because those two things were only exemplary, meaning they would be indicative of your system having a bottleneck in almost all types of workloads. This is supported by the generally higher performance in 64-bit mode.

  • What would 128 bits computing look like?
  • Clearly you can address more bytes than your data bus width would suggest. But then why all the "hacks" on 32-bit architectures? Like the 36-bit address bus via memory mapping on SPARCv8 instead of using paired index registers (or ARMv7 with LPAE). From a performance perspective, using an address width that is not the native register width / internal data bus width is an issue. For a significant subset of operations, multiple instructions are required instead of one.

    Also, is your comment about Turing completeness to be taken seriously? We are talking about performance and practicality. Go ahead and crunch some 64-bit floats using purely 8-bit arithmetic operations (or even using vector registers). Of course you can, but the point is that a suitable word size is more effective for certain computational tasks. Operations that are done frequently should ideally be done at native data-bus width. Vectorized operations will also cost performance.
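
    A minimal sketch of why wide arithmetic on narrow registers costs extra instructions (Python standing in for what a compiler emits as an add / add-with-carry pair on a 32-bit target):

    ```python
    # 64-bit addition out of 32-bit pieces: two adds plus explicit carry
    # handling, where a 64-bit machine needs a single instruction.
    MASK32 = 0xFFFF_FFFF

    def add64_on_32bit(a_lo, a_hi, b_lo, b_hi):
        lo = (a_lo + b_lo) & MASK32
        carry = (a_lo + b_lo) >> 32      # 1 if the low-word add overflowed
        hi = (a_hi + b_hi + carry) & MASK32
        return lo, hi

    # 0x1_FFFF_FFFF + 1 == 0x2_0000_0000
    lo, hi = add64_on_32bit(0xFFFF_FFFF, 0x1, 0x1, 0x0)
    assert (hi << 32) | lo == 0x2_0000_0000
    ```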

  • What would 128 bits computing look like?
  • I am unsure about the historical reasons for moving from 32-bit to 64-bit, but wouldn't the address space be a significantly larger factor? Like you said, CPUs have had vector instructions for a long time, and we wouldn't move to 128-bit architectures just to be able to compute with numbers of that size. Memory bandwidth is, also as you say, limited by the bus widths and not the processor architecture. IMO, the most important reason we transitioned to 64-bit is the larger address space, without having to use stupidly complex memory mapping schemes. There are also some types of numbers, like timestamps and counters, that profit from 64 bits, but even here I am not sure whether the more complex architecture would yield a net slowdown or speedup.

    To answer the original question: 128 bits would bring no helpful benefit for the address space (it is already massive) and would probably just slow everyday calculations down.
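
    Some quick numbers on how "already massive" 64-bit addressing is:

    ```python
    # Bytes addressable per pointer width.
    for bits in (32, 48, 64, 128):
        print(f"{bits:>3}-bit addresses -> 2**{bits} ≈ {2**bits:.3e} bytes")
    # 32-bit ≈ 4.3e9 (4 GiB); 48-bit ≈ 2.8e14 (what x86-64 actually wires up);
    # 64-bit ≈ 1.8e19 (16 EiB); 128-bit ≈ 3.4e38 -- beyond anything buildable.
    ```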

    lte678 @feddit.de
    Posts 0
    Comments 23