averagelolcodeenthusiast
Untitled
4 posts
Last active 60 minutes ago
averagelolcodeenthusiast · 30 days ago
Text
god forbid, i want to be praised and degraded at the same time 🙄
443 notes · View notes
averagelolcodeenthusiast · 30 days ago
Text
morning horny is crazy why is my clit throbbing at 7 am?? I'm too cozy to deal with that
6K notes · View notes
averagelolcodeenthusiast · 2 months ago
Text
Tumblr media
my accurately labelled map of america. how did i do chat
330 notes · View notes
averagelolcodeenthusiast · 10 months ago
Text
I think the most important thing to understand about ML (AI) is that it is, fundamentally, unexplainable intelligence, or intelligence without understanding. What I mean by this is that it solves a problem in a non-algorithmic way, a way that can't be properly explained or duplicated. Although this may sound like a "bad" thing, it's actually where ML gets its strength, and it's really no different from how our brains work. We can't explain how we see, or come up with words, or do pretty much any fundamental mental task. Seeing is simply too complicated an operation to be done generally and algorithmically. It's the same with ML.
For instance, meteorologists can (imperfectly) predict the weather using a simulation, which takes our understanding of physics and applies it to the atmosphere to predict how it will change. This is understandable intelligence: we understand how each component works and how they combine to create predictions. However, this approach can only take in as many factors as we feed into it. Using ML, on the other hand, we can account for factors we didn't even know existed.
The famous butterfly thought experiment says that, because weather is such a massive chaotic system, a tiny disturbance like a butterfly flapping its wings could lead to a hurricane further down the line. Although ML can't account for something as small and random as a single butterfly, it can account for an almost unlimited number of other factors: maybe the migratory patterns of Monarch butterflies, or the influence of small but regular ocean currents on air temperatures. The model doesn't know that butterflies are migrating, though. It just knows that there is a disturbance at a specific place and time of year, and that when there is such a disturbance it leads to a set of probable consequences.
The very fact of its non-understanding allows it to account for millions of tiny factors, which leads to more accuracy than could ever be achieved through traditional methods.
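a toy sketch of what i mean (made-up numbers, plain least-squares fit): pretend x is some unlabeled atmospheric disturbance and y is tomorrow's temperature. the fit never learns *what* x is, only that x predicts y.

```python
# "Intelligence without understanding" in miniature: fit y ≈ a*x + b
# from data alone. The code never knows what x represents.

def fit_line(xs, ys):
    """Ordinary least squares for a single feature."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    b = my - a * mx
    return a, b

# Hypothetical measurements of some disturbance vs. next-day temperature.
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 3.9, 6.2, 7.8, 10.1]

a, b = fit_line(xs, ys)
print(f"prediction for x=6: {a * 6 + b:.1f}")
```

it extrapolates fine without ever containing an explanation of the disturbance, which is the whole point: correlation it can use, mechanism it doesn't have.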
2 notes · View notes