#AI and machine learning programs
Text
Artificial Intelligence and Machine Learning Courses in Bangalore: A Guide to Advancing Your Career
#Artificial Intelligence courses#Top AI courses#AI and machine learning programs#AI certification Bangalore#Best machine learning institutes#Machine learning certification#Career in Artificial Intelligence#AI and ML Courses
1 note
Note
I just watched a video where someone is using ChatGPT to generate comments on their code. Even as a layman I feel like I should be screaming at him, but on a scale from 1 to apocalypse, how bad is this?
Machine-generated comments could not possibly be more useless, nonsensical or maliciously misleading than most of the human-generated comments I've seen.
1K notes
Text
I think with AI we kinda lost the plot, why am I always being advertised yet another shitty generative AI that does basic shit any person can do themselves. No I don’t want to use AI to make a shopping list. Idk I think we should relegate that stuff to like, machine learning to predict protein structures or identify cancer cells and stuff like that please
#and I have like a genuine interest in machine learning and programming#mostly in bio but still I’m just so confused about how we got here#ai
14 notes
Text
I desperately need someone to talk to about this
I've been working on a system to allow a genetic algorithm to create DNA code which can produce self-organising organisms. Someone I know has created a very effective genetic algorithm which blows NEAT out of the water in my opinion. This algorithm is very good at using food values to determine which organisms to breed and how to breed them, and it has a multitude of different biologically inspired mutation mechanisms which allow for things like meta genes and meta-meta genes, and a whole slew of other things. I am building a translation system on top of it, basically a compiler, and designing an instruction set and genetic repair mechanisms to allow it to convert ANY hexadecimal string into a valid, operable program. I'm doing this by having an organism with, so far, 5 planned chromosomes. The first and second chromosomes are the INITIAL STATE of a neural network: the number and configuration of input nodes, the number and configuration of output nodes, whatever code it needs for a fitness function, and the configuration and weights of the layers. This neural network is not used at all in the fitness evaluation of the organism; it is purely something the organism itself can manage, train, and utilize however it sees fit.
The third is the complete code of the program which runs the organism. It's basically a list of ASM opcodes and arguments written in hexadecimal, made up of codons which represent the different hexadecimal characters, as well as a start and stop codon. This program will be compiled into executable machine code using LLVM IR and a custom instruction set I've designed to give the organisms a Turing-complete programming language, plus some helper functions to make certain processes simpler to evolve. This includes messages between the organisms, reproduction methods, and all the methods necessary for the organisms to develop sight and hearing, receive various other inputs, and also produce outputs like audio, video, mouse, keyboard, or gamepad output. The fourth is a blank slate in which the organism can evolve whatever data it wants. The first half will be the complete contents of the organism's ROM after the important information, and the second half will be the initial state of the organism's memory. This will likely be stored as Base64 of its hash and unfolded into binary on compilation.
The 5th chromosome is one I just came up with, and I am very excited about it: it will be a translation dictionary. It will be exactly 512 individual codons, with each codon pair mapped to a value between 00 and FF hex. When evaluating the hex of the other chromosomes, this dictionary will be used to determine the equivalent instruction for any given hex pair. When evolving, each hex pair in the 5th chromosome will be guaranteed to be a valid opcode in the instruction set by using modulus to constrain each pair to the 55 instructions currently available. This will allow an organism to evolve its own instruction distribution, preventing random instructions which might be harmful or inefficient from springing up as often and instead selecting more often for efficient or safer instructions.
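To make that concrete, here's a rough Python sketch of how the dictionary lookup and the modulus constraint could work (the function names, codon layout, and handling of the 55-instruction count are just my illustrative assumptions, not the actual implementation):

```python
# Illustrative sketch: mapping a 512-codon translation chromosome onto a
# 55-instruction set, then using it to decode hex pairs from another chromosome.
# Names and layout are assumptions for demonstration only.

NUM_INSTRUCTIONS = 55  # size of the hypothetical instruction set

def build_translation_table(chromosome5_hex: str) -> dict[str, int]:
    """Interpret the 5th chromosome (512 hex characters = 256 codon pairs)
    as a mapping from every byte value 00-FF to an opcode index."""
    assert len(chromosome5_hex) == 512, "expected 256 hex pairs"
    table = {}
    for i in range(256):
        pair = chromosome5_hex[2 * i : 2 * i + 2]
        # Modulus keeps every evolved value inside the valid opcode range.
        table[f"{i:02X}"] = int(pair, 16) % NUM_INSTRUCTIONS
    return table

def decode_program(chromosome3_hex: str, table: dict[str, int]) -> list[int]:
    """Translate another chromosome's hex pairs into opcode indices."""
    return [table[chromosome3_hex[i : i + 2].upper()]
            for i in range(0, len(chromosome3_hex) - 1, 2)]
```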
#ai#technology#genetic algorithm#machine learning#programming#python#ideas#discussion#open source#FOSS#linux#linuxposting#musings#word vomit#random thoughts#rant
7 notes
Text
they're shaking hands honest
(ai/machine learning generated animations)
14 notes
Text

This is part of a new project I am doing for a Facebook app that can alert someone when there is suspicious activity on their account, and block people who post rude comments and hate speech using a BERT model I am training on a hate-speech dataset. It automatically blocks people who are really rude or mean and keeps your feed clean of spam. I am developing it right now for work and for @emoryvalentine14 to test out, and maybe in the future I will make it public.
I love NLP :D Also I plan to host the server, probably on Heroku or something, after it is done.
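For the curious, a rough sketch of what the classification step could look like with Hugging Face Transformers (the checkpoint name, label, and threshold below are placeholders I made up, not the actual project code):

```python
# Rough sketch of a comment-moderation check with a fine-tuned BERT classifier.
# The model name, label, and threshold are illustrative placeholders.
from transformers import pipeline

# Text-classification pipeline wrapping a BERT-style model fine-tuned on a
# hate-speech / toxicity dataset (hypothetical checkpoint name).
classifier = pipeline("text-classification", model="my-org/bert-hate-speech")

def should_block(comment: str, threshold: float = 0.9) -> bool:
    """Return True if the comment looks like hate speech or abuse."""
    result = classifier(comment)[0]   # e.g. {"label": "HATE", "score": 0.97}
    return result["label"] == "HATE" and result["score"] >= threshold

if should_block("some incoming Facebook comment"):
    print("Blocking user and hiding comment")
```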
#machine learning#artificial intelligence#python programming#programmer#programming#technology#coding#python#ai#python 3#social media#stopthehate#lgbtq community#lgbtqia#lgbtqplus#gender equality
74 notes
Text
OpenAI Releases Codex: A Software Agent that Operates in the Cloud and Can Do Many Tasks in Parallel
OpenAI has released a research preview of Codex, a cloud-based software engineering agent that is more than just another code-completion tool: it spins up isolated sandboxes, pulls your repo, and chips away at features, bug fixes, test suites, and even pull-request boilerplate, often in parallel.
What is OpenAI Codex? 📌
→ Cloud-based software engineering agent
→ Can write features, answer codebase questions, run tests, and propose Pull Requests for review
→ Each task runs in its own isolated cloud environment
→ Provides detailed terminal logs, test outputs, and citations
→ Users can create AGENTS.MD files in their repository to instruct Codex on project-specific commands, testing procedures, and coding standards
→ Powered by codex-1
How to use Codex: 📌
→ Users can access Codex through the ChatGPT sidebar
→ Assign coding tasks by typing a prompt
→ Each request is handled independently
→ Codex can read and edit files and run commands like test suites, linters, and type checkers
→ Task completion generally takes between one and thirty minutes
Once a task is done, Codex commits its changes within its sandboxed environment; users can then review the changes, ask for more revisions, open a GitHub PR, or pull the changes into their local setup.
↗️ Full Read: https://aiagent.marktechpost.com/post/openai-releases-codex-a-software-agent-that-operates-in-the-cloud-and-can-do-many-tasks-in-parallel
Codex: Availability 📌
Codex is currently rolling out to ChatGPT Pro, Enterprise, and Team users, with access for Plus and Edu users coming soon.
#agentic ai#ai#ai agency#ai agents#artifical intelligence#chatgpt#codex#ChatGPT Codex#OpenAI Codex#coding#programming#engineering#software#machine learning#software engineering agent#software engineering#coding agent
5 notes
Text
why is everything called AI now. boy thats an algorithm
#'ubers evil AI will detect your phone battery is low and raise the prices accordingly' thats fucked up but it#doesnt need a machine learning program for that#and not algorithm like your social media feed or whatever like in the most basic sense of the definition
7 notes
Text
Learn how Mistral-NeMo-Minitron 8B, a collaboration between NVIDIA and Mistral AI, is revolutionizing Large Language Models (LLMs). This open-source model uses advanced pruning and distillation techniques to achieve top accuracy on nine benchmarks while remaining highly efficient.
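As a generic illustration of the distillation side (not NVIDIA's actual Minitron recipe), a smaller student model can be trained to match a larger teacher's softened output distribution alongside the true labels; a minimal PyTorch-style sketch of such a loss might look like this:

```python
# Generic knowledge-distillation loss sketch (illustrative only):
# the student matches the teacher's softened logits plus the ground-truth labels.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    # Soft targets: KL divergence between softened teacher and student distributions.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    # Hard targets: ordinary cross-entropy against the true class labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard
```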
#MistralNeMoMinitron#AI#ModelCompression#OpenSource#MachineLearning#DeepLearning#NVIDIA#MistralAI#artificial intelligence#open source#machine learning#software engineering#programming#nlp
5 notes
Text
Truth speaking on the corporate obsession with AI
Hilarious. Something tells me this person's on the hellsite(affectionate)
#ai#corporate bs#late stage capitalism#funny#truth#artificial intelligence#ml#machine learning#data science#data scientist#programming#scientific programming
8 notes
Text
The Mathematical Foundations of Machine Learning
In the world of artificial intelligence, machine learning is a crucial component that enables computers to learn from data and improve their performance over time. However, the math behind machine learning is often shrouded in mystery, even for those who work with it every day. Anil Ananthaswamy, author of the book "Why Machines Learn," sheds light on the elegant mathematics that underlies modern AI, and his journey to it is a fascinating one.
Ananthaswamy's interest in machine learning began when he started writing about it as a science journalist. His software engineering background sparked a desire to understand the technology from the ground up, leading him to teach himself coding and build simple machine learning systems. This exploration eventually led him to appreciate the mathematical principles that underlie modern AI. As Ananthaswamy notes, "I was amazed by the beauty and elegance of the math behind machine learning."
Ananthaswamy highlights the elegance of machine learning mathematics, which draws on the familiar fields of calculus, linear algebra, probability, and statistics but goes beyond them. He points to specific theorems and proofs, such as the 1959 proof related to artificial neural networks, as examples of the beauty and elegance of machine learning mathematics. For instance, gradient descent, a fundamental algorithm used in machine learning, is a powerful example of how math can be used to optimize model parameters.
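As a concrete illustration, here is a toy gradient-descent loop in Python that fits a single parameter by repeatedly stepping against the gradient of a squared-error loss (a minimal sketch for illustration, not an example from the book):

```python
# Toy gradient descent: fit y = w * x to data by minimizing mean squared error.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.1, 3.9, 6.2, 7.8]   # roughly y = 2x

w = 0.0                      # initial parameter
lr = 0.01                    # learning rate

for step in range(200):
    # Gradient of L(w) = mean((w*x - y)^2) with respect to w:
    # dL/dw = mean(2 * x * (w*x - y))
    grad = sum(2 * x * (w * x - y) for x, y in zip(xs, ys)) / len(xs)
    w -= lr * grad           # step downhill along the gradient

print(round(w, 3))           # converges to roughly 2.0
```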
Ananthaswamy emphasizes the need for a broader understanding of machine learning among non-experts, including science communicators, journalists, policymakers, and users of the technology. He believes that only when we understand the math behind machine learning can we critically evaluate its capabilities and limitations. This is crucial in today's world, where AI is increasingly being used in various applications, from healthcare to finance.
A deeper understanding of machine learning mathematics has significant implications for society. It can help us evaluate AI systems more effectively, develop more transparent and explainable AI systems, address AI bias, and ensure fairness in decision-making. As Ananthaswamy notes, "The math behind machine learning is not just a tool, but a way of thinking that can help us create more intelligent and more human-like machines."
The Elegant Math Behind Machine Learning (Machine Learning Street Talk, November 2024)
youtube
Matrices are used to organize and process complex data, such as images, text, and user interactions, making them a cornerstone in applications like Deep Learning (e.g., neural networks), Computer Vision (e.g., image recognition), Natural Language Processing (e.g., language translation), and Recommendation Systems (e.g., personalized suggestions). To leverage matrices effectively, AI relies on key mathematical concepts like Matrix Factorization (for dimension reduction), Eigendecomposition (for stability analysis), Orthogonality (for efficient transformations), and Sparse Matrices (for optimized computation).
The Applications of Matrices - What I wish my teachers told me way earlier (Zach Star, October 2019)
youtube
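As a small illustration of the matrix-factorization idea, the following sketch uses NumPy's SVD to build a low-rank approximation of a user-item rating matrix (the data and the choice of rank are invented for the example):

```python
# Low-rank approximation via truncated SVD, a simple form of matrix factorization
# used for dimension reduction and recommendation. The data here is invented.
import numpy as np

ratings = np.array([            # rows = users, columns = items
    [5, 4, 0, 1],
    [4, 5, 1, 0],
    [1, 0, 5, 4],
    [0, 1, 4, 5],
], dtype=float)

U, s, Vt = np.linalg.svd(ratings, full_matrices=False)

k = 2                                        # keep the two strongest latent factors
approx = U[:, :k] @ np.diag(s[:k]) @ Vt[:k]  # rank-2 reconstruction

print(np.round(approx, 2))   # smoothed ratings; gaps are filled in by the latent factors
```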
Transformers are a type of neural network architecture introduced in 2017 by Vaswani et al. in the paper “Attention Is All You Need”. They revolutionized the field of NLP by outperforming traditional recurrent neural network (RNN) and convolutional neural network (CNN) architectures in sequence-to-sequence tasks. The primary innovation of transformers is the self-attention mechanism, which allows the model to weigh the importance of different words in the input irrespective of their positions in the sentence. This is particularly useful for capturing long-range dependencies in text, which was a challenge for RNNs due to vanishing gradients.
Transformers have become the standard for machine translation, offering state-of-the-art results in translating between languages, and they are used for both abstractive and extractive summarization, generating concise summaries of long documents. In question answering they help the model understand the context of a question and identify relevant answers in a given text, and by analyzing the context and nuances of language they can accurately determine the sentiment behind text. While initially designed for sequential data, variants of transformers (e.g., Vision Transformers, ViT) have been successfully applied to image recognition tasks, treating images as sequences of patches. Transformers also improve the accuracy of speech-to-text systems by better modeling the sequential nature of audio data, and the self-attention mechanism can help uncover patterns in time series data, leading to more accurate forecasts.
Attention Is All You Need (Umar Jamil, May 2023)
youtube
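To see the self-attention mechanism concretely, here is a minimal NumPy sketch of scaled dot-product attention (the inputs and shapes are made up; real implementations add multiple attention heads, masking, and learned projection matrices):

```python
# Minimal scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                      # query-key similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)       # softmax over keys
    return weights @ V                                   # weighted sum of values

# Toy example: 3 tokens with 4-dimensional embeddings (random for illustration).
rng = np.random.default_rng(0)
tokens = rng.normal(size=(3, 4))
out = scaled_dot_product_attention(tokens, tokens, tokens)   # self-attention
print(out.shape)   # (3, 4): one contextualized vector per token
```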
Geometric deep learning is a subfield of deep learning that extends neural networks beyond regular grids to data with geometric structure, such as graphs and manifolds, by exploiting the symmetries and invariances of that structure. The field has gained significant attention in recent years.
Michael Bronstein: Geometric Deep Learning (MLSS Kraków, December 2023)
youtube
Traditional Geometric Deep Learning, while powerful, often relies on the assumption of smooth geometric structures. However, real-world data frequently resides in non-manifold spaces where such assumptions are violated. Topology, with its focus on the preservation of proximity and connectivity, offers a more robust framework for analyzing these complex spaces. The inherent robustness of topological properties against noise further solidifies the rationale for integrating topology into deep learning paradigms.
Cristian Bodnar: Topological Message Passing (Michael Bronstein, August 2022)
youtube
Sunday, November 3, 2024
#machine learning#artificial intelligence#mathematics#computer science#deep learning#neural networks#algorithms#data science#statistics#programming#interview#ai assisted writing#machine art#Youtube#lecture
4 notes
Text
every time one of my university classmates expresses genuine interest in utilizing ai, an angel gets taken out back and shot violently with a gun
#lixx rambles#how does this keep happening#WE'RE ANIMATION STUDENTS#WE HAVE CLASSES ON LEARNING TO DRAW#WHY DO YOU WANT THE STUPID ART STEALING MACHINE#“the programs are cracked at home and we cant use the ai” GOOD.#sorry gamers this pisses me off
1 note
Text
anyone over here likes coding, machine learning and AI?
could I get some mutuals with these interests?
3 notes
Text
now we're cooking with gas
#this is a phd post#ai#machine learning#the model is using my graphics card like a proper ai model#codeblr#programming
9 notes
Text
youtube
#Aperture#video essay#algorithm#algorithms#Eric Loomis#COMPAS#thought piece#computer#computer program#data#data brokers#targeted ads#data breach#terminal#the silver machine#AI#machine learning#healthcare#tech#technology#profit#Youtube
2 notes
Text
Hey all, so the crowdfund is up for ReachAI. If anyone wants to go check it out it would mean a lot to me! Also you can watch the video there on IndieGOGO or here:
youtube
It should give you a bit of an idea of what ReachAI is and what the nonprofit will be doing, as well as the benefits of becoming a donor (there are even more than I talked about in the video, including webinars, 1-on-1 sessions with me, and a newsletter update on the research the organization is working on, or that I am working on right now). I am excited to be bringing ReachAI closer to launch day, and I am really hoping I can raise the money to get it started! I know it could do so much good in the world :3
#programming#programmer#artificial intelligence#machine learning#technology#python programming#coding#ai#python#programmers#data science#medical research#medical technology#aicommunity#aiinnovation#Youtube
24 notes