
person594

@[email protected]


Do cosmologists know for sure that the Big Bang is propelling all matter away?

Is it at all possible that instead of being pushed away, we are instead getting pulled toward something huuuuuge via gravity? As if we are falling into something way greater than ourselves? I thought this was a wild idea but after I Googled it I found out that there is such a thing as a “Great Attractor”. Something 150...

person594,

Kind of a tangent at this point, but there is a very good reason that that couldn’t be the case: according to the shell theorem, no point in the interior of a spherical shell of matter experiences any net gravitational force. Wherever you are inside the shell, the contributions cancel exactly.
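For intuition, here’s a minimal Monte Carlo sketch of the shell theorem (unit masses and G = 1 are my own assumptions for illustration): sample points uniformly on a spherical shell and average the Newtonian force they exert on a point inside it.

```python
import numpy as np

rng = np.random.default_rng(0)

# Sample N points uniformly on a unit spherical shell by
# normalizing standard-normal vectors.
N = 200_000
v = rng.normal(size=(N, 3))
shell = v / np.linalg.norm(v, axis=1, keepdims=True)

# A test point strictly inside the shell, well off-center.
p = np.array([0.5, 0.2, -0.1])

# Newtonian force per unit mass from each shell element (G = 1):
# F_i = (r_i - p) / |r_i - p|**3, averaged over the shell.
d = shell - p
dist = np.linalg.norm(d, axis=1, keepdims=True)
net = (d / dist**3).mean(axis=0)

print(net)  # ≈ [0, 0, 0], up to Monte Carlo noise
```

Individual elements pull hard in every direction; it’s the sum that vanishes, and only in the interior.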

Otherwise, though, the metric expansion of space is different from typical movement, and it isn’t correct to say that things are being pushed or pulled. Rather, the distance between every pair of points in the universe increases over time, with the rate of increase proportional to the points’ separation. Importantly, for very distant points, the distance can increase faster than the speed of light, which would be disallowed by any model that describes the expansion in terms of objects moving in a traditional sense.
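To put a rough number on that, here’s a small sketch of Hubble’s law, v = H0 · d; the H0 value is an assumed round figure, not something from the thread:

```python
c = 299_792.458  # speed of light, km/s
H0 = 70.0        # Hubble constant, km/s per Mpc (assumed round value)

# Hubble's law: apparent recession rate v = H0 * d. Beyond the
# Hubble distance d_H = c / H0, that rate exceeds c. This is
# allowed: nothing is moving *through* space that fast; the
# distances themselves are growing.
d_H = c / H0
print(f"Hubble distance ≈ {d_H:,.0f} Mpc")  # ≈ 4,283 Mpc
```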

AI language models can exceed PNG and FLAC in lossless compression, says study (arstechnica.com)

While LLMs have been used for… a lot, it seems like this use might be one where they’re not only reliable but actually outperform existing compression methods. Being able to cram more data into less space tends to lead to interesting developments, so I will be keeping my eye on this....

person594,

That isn’t really the case; while many neural network implementations make nondeterministic optimizations, floating-point arithmetic is in principle entirely deterministic, and it isn’t too hard to get a neural network to run deterministically if needed. That makes them perfectly applicable to lossless compression, which is what is done in this article.
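As a minimal sketch of forcing determinism, assuming PyTorch (full run-to-run determinism on GPU may additionally require environment settings such as CUBLAS_WORKSPACE_CONFIG):

```python
import torch

# Fail loudly if any op would fall back to a nondeterministic
# kernel, and pin all RNG seeds.
torch.use_deterministic_algorithms(True)
torch.manual_seed(0)

# Same weights + same input => bit-identical outputs, which is
# exactly what an entropy coder needs from its probability model.
model = torch.nn.Linear(8, 8)
x = torch.ones(1, 8)
print(torch.equal(model(x), model(x)))  # True
```

Lossless schemes like the one in the article feed those deterministic model probabilities into an entropy coder, so the encoder and decoder must reproduce them bit for bit.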

person594,

Let’s just outlaw racism too while we’re at it!

person594,

Have you tried looking between two Casimir plates?

person594,

Unix time is just the number of seconds since January 1, 1970 (UTC), isn't it? How is that base 10, or any other base? If anything, you might argue it's base 2, since computers generally store integers in binary, but the definition is base-independent afaik.
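A quick Python illustration of that base independence: the stored value is a single integer, and decimal, binary, or hex are just ways of printing it.

```python
import time

t = int(time.time())  # seconds elapsed since 1970-01-01 00:00:00 UTC
print(t)              # the integer rendered in base 10
print(bin(t))         # the same integer in base 2
print(hex(t))         # ...and in base 16
```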
