#COBOL programming jobs
Explore tagged Tumblr posts
Text
Legacy Code, Modern Paychecks: The Surprising Demand for Antique Programming Languages
Why old-school tech skills like COBOL and Fortran are still landing high-paying gigs in 2025
In a tech world obsessed with the latest frameworks and cutting-edge AI, it might surprise you to learn there's still an active, and even lucrative, market for what many would call antique programming languages. We're talking COBOL, Fortran, Ada, LISP, and other veterans of computing history. But is…
#antique programming languages#COBOL and Fortran careers#COBOL programming jobs#IT Consulting#IT Contracting#legacy code developers
0 notes
Text
Why You Should Get Yourself a Job with Legacy Programming Languages
Key Takeaways: "Old" languages are still relevant: FORTRAN, COBOL, and Pascal continue to play crucial roles in specific domains. FORTRAN excels in scientific computing and high-performance computing. COBOL remains essential in financial systems for handling large datasets and transactions. Pascal influenced modern languages and continues to be used in education. Career opportunities…
#career#COBOL#computer science#computers#dailyprompt#education#finance#FORTRAN#hacking#history#jobs#legacy systems#Pascal#Programming#scientific computing#technology
0 notes
Text
i love it when my payment systems are "extraordinarily sophisticated" and "there are few people who understand them well"

i promise we don't need to go to bat for what is almost certainly the most nightmarish legacy codebase currently extant in the world. why do people think the failure mode of software is like. holes being punched physically through the source
#COBOL isn't even hard smh#it's just no one wants to learn a skill that gets them a 45k a year programming job#most "mission-critical" COBOL systems have a bus number of 1 or 0 it's just no one gives a shit if they work
55 notes
·
View notes
Note
hi, it's the anon who's stopping by! how are you doing? I hope you're having a nice, relaxing weekend!
things over here are... same old same old, I guess. I'm starting to get frustrated at myself for not being over this whole situation, since it's been almost half a year now since it all started, but it is what it is. there are additional circumstances that make it harder to overcome it all, but still I can't help but get angry at myself sometimes.
anyway that's not why I wanted to stop by, sorry. I saw your post about how sometimes your job feels like herding cats, having to manage a team of programmers, and it made me laugh because it felt a bit like hearing from the other side. I'm also a programmer, and my team has a project owner whom I'm sure we must drive up the wall sometimes with things like the ones you mentioned. my teammates and I need to get him a thank you present for putting up with us. I'm sure your team also appreciates all the effort you put into wrangling them and keeping them on track!
I hope you have a lovely weekend, as always I'm sending you lots of good thoughts and I hope that things keep looking up!!
hi anon, good to see you again! it was in fact a nice weekend (and a very busy one, considering i'm just now catching up with tumblr!) I was visiting some family who lives a little out of town and haha, as often happens, there was drama. ngl i was glad to get home sunday night and have some alone time.
ooof, i can totally relate to the whole not being over a crappy situation. i've been trying to tell myself that it's in my best interests to accept as much as i can and not dwell on it, because otherwise i'll be one of the main forces making myself miserable. but it is not easy! i am trying not to be too hard on myself when it doesn't work out, and I'm sending those same vibes to you!
that's so funny about the programming thing! truly two sides of the same coin. it's definitely a symbiotic relationship; they (hopefully) appreciate me keeping things on track and i definitely appreciate them wanting to make everything work in the most optimal way it can.
also yay, computer high five! (fun fact: i had to learn cobol as part of my undergrad program. i've forgotten most of the details but i like to think that in a worst-case scenario, i can brush up and get a mainframe job lol)
i hope you had a nice weekend!
#anonymous#thanks for the ask!#sometimes i really miss the problem solving/debugging aspect of programming#so much more fun than dealing with executive management meetings!#stopping by anon
2 notes
·
View notes
Text
"I am retired from a very good tech job and recently I created a perfect COBOL program using CoPilot and I did it in less than 5 minutes. Yes, tech experts are in danger as their own AI tools challenge their employment."
Again, AI is touching us softly on our lips and kissing us goodbye.
0 notes
Text
IF YOU INVEST AT 20 AND THE COMPANY IS STARTING TO APPEAR IN THE MAINSTREAM
8x 5% 12. Many of our taboos future generations will laugh at is to start with. But like VCs, they invest other people's money makes them doubly alarming to VCs. If it isn't, don't try to raise money, they try gamely to make the best case, the papers are just a formality. Understand why it's worth investing in. But at each point you know how you're doing. Only a few companies have been smart enough to realize this so far. If you run out of money, you probably never will. Just as our ancestors did to explain the apparently too neat workings of the natural world. Genes count for little by comparison: being a genetic Leonardo was not enough to compensate for having been born near Milan instead of Florence. The last one might be the most plausible ones. And yet a lot of other domains, the distribution of outcomes follows a power law, but in startups the curve is startlingly steep.
The list is an exhaustive one. I can't tell is whether they have any kind of taste. And if so they'll be different to deal with than VCs. The people are the most important of which was Fortran. It is now incorporated in Revenge of the Nerds. I have likewise cavalierly dismissed Cobol, Ada, Visual Basic, the IBM AS400, VRML, ISO 9000, the SET protocol, VMS, Novell Netware, and CORBA, among others. When people first start drawing, for example, because Paypal is now responsible for 43% of their sales and probably more of their growth. We fight less. You tell them only 1 out of 100 successful startups has a trajectory like that, and they have a hard time getting software done. What if some idea would be a remarkable coincidence if ours were the first era to get everything just right. In hacking, this can literally mean saving up bugs. I know that when it comes to code I behave in a way that seems to violate conservation laws.
Few would deny that a story should be like life. Steve Wozniak wanted a computer, Google because Larry and Sergey found, there's not much of a market for ideas. For a painter, a museum is a reference library of techniques. For a long time to work on as there is nothing so unfashionable as the last, discarded fashion, there is something even better than C; and plug-and-chug undergrads, who are both hard to bluff and who already believe most other investors are conventional-minded drones doomed always to miss the big outliers. As in any job, as you continue to design things, these are not just theoretical questions. But evidence suggests most things with titles like this are linkbait. Almost every company needs some amount of pain. I'd find something in almost new condition for a tenth its retail price at a garage sale.
Once you phrase it that way, the answer is obvious: from a job. A company that grows at 1% a week will in 4 years be making $25 million a month. You feel this when you start. Starting a startup is committing to solve any specific problem; you don't know that number, they're successful for that week. For example, when Leonardo painted the portrait of Ginevra de Benci, their attention is often immediately arrested by it, because our definition of success is that the business guys choose people they think are good programmers it says here on his resume that he's a Microsoft Certified Developer but who aren't. After they merged with X. Once investors like you, you'll see them reaching for ideas: they'll be saying yes, and you have to understand what they need. Just wait till all the 10-room pensiones in Rome discover this site. You're better off if you admit this up front, and write programs in a way that allows specifications to change on the fly. Working from life is a valuable tool in painting too, though its role has often been misunderstood. The founders can't enrich themselves without also enriching the investors. You're committing not just to intelligence but to ability in general, you can not only close the round faster, but now that convertible notes are becoming the norm, actually raise the price to reflect demand.
Most investors are genuinely unclear in their own minds why they like or dislike startups. Actor too is a pole rather than a threshold. But here again there's a tradeoff between smoothness and ideas. Starting startups is not one of them. The classic way to burn through cash is by hiring a lot of this behind the scenes stuff at YC, because we invest in such a large number of companies, and we invest so early that investors sometimes need a lot of founders are surprised by it. In the original Java white paper, Gosling explicitly says Java was designed not to be too difficult for programmers used to C. And this team is the right model, because it coincided with the amount. Those are the only things you need at first.
Not always. And so an architect who has to build on a difficult site, or a programming language is obviously doesn't know what these things are, either. One reason this advice is so hard to follow is that people don't realize how hard it was to get some other company to buy it. You can see that in the back of their minds, they know. But that's still a problem for big companies, because they seem so formidable. It's an interesting illustration of an element of the startup founder dream: that this is a coincidence. They try to convince with their pitch. In most fields the great work is done early on.
This is supposed to be the default plan in big companies. The people you can say later Oh yeah, we had to interrupt everything and borrow one of their fellow students was on the Algol committee, got conditionals into Algol, whence they spread to most other languages. This is in contrast to Fortran and most succeeding languages, which distinguish between expressions and statements. And if it isn't false, it shouldn't be suppressed. I mentioned earlier that the most successful startups seem to have done it by fixing something that they thought ugly. In 1989 some clever researchers tracked the eye movements of radiologists as they scanned chest images for signs of lung cancer. Darwin himself was careful to tiptoe around the implications of his theory. Running a business is so much more enjoyable now. Don't worry what people will say. Growth is why it's a rational choice economically for so many founders to try starting a startup consists of. If there are x number of customers who'd pay an average of $y per year for what you're making, then the total addressable market, or TAM, of your company, if they can get DARPA grants.
Fortunately, more and more startups will. Good design is often slightly funny. Unconsciously, everyone expects a startup to work on technology, or take venture funding, or have some sort of exit. And I'm especially curious about anything that's forbidden. Angels would invest $20k to $50k apiece, and VCs usually a million or more. Nowadays Valley VCs are more likely to take 2-3x longer than I always imagine. In the mid twentieth century there was a lot less than the 30 to 40% of the company you usually give up in one shot. A deals would prefer to take half as much stock, and then just try to hit it every week. What's wrong with having one founder? Within the US car industry there is a kind of final pass where you caught typos and oversights.
#automatically generated text#Markov chains#Paul Graham#Python#Patrick Mooney#Java#Valley#price#Many#everything#VRML#things#founder#startups#trajectory#count#generations#default#attention#sale#undergrads#lot#way#anything#eye#languages#car#exit#distribution
0 notes
Text
What is COBOL Mainframe Programming?
COBOL (Common Business-Oriented Language) is one of the oldest high-level programming languages, designed specifically for business, finance, and administrative systems. Developed in 1959 by the CODASYL committee and first published as COBOL-60 in 1960, it was created to provide a platform-independent language for data processing and large-scale transaction management.
The Role of COBOL in Mainframe Systems
Mainframes are powerful computers designed to handle and process massive amounts of data and transactions, often in real-time. COBOL serves as the backbone of mainframe systems, powering critical applications in industries such as banking, insurance, healthcare, government, and retail.
Key Features of COBOL Programming
Readable Syntax: COBOL uses English-like syntax, making it easier for non-programmers to understand and maintain the code.
Scalability: It efficiently handles vast amounts of transactions and data.
Legacy Integration: COBOL systems integrate seamlessly with legacy systems, which are still widely used in critical operations.
Batch and Real-time Processing: COBOL supports both batch processing (handling large data jobs) and real-time processing (instantaneous transaction execution).
Precision and Accuracy: The language excels in financial computations, ensuring reliable results.
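That precision claim has a concrete basis: COBOL declares money fields as fixed-point decimal (for example, a PIC 9(7)V99 picture clause), so arithmetic on cents never suffers binary floating-point drift. Here is a rough Python analogue using the standard decimal module; the prices and tax rate are invented for illustration, and this is a sketch of the idea rather than COBOL itself.

```python
from decimal import Decimal, ROUND_HALF_UP

# COBOL keeps money in fixed-point decimal fields, so arithmetic on
# cents is exact. Python's decimal module gives a comparable guarantee.
price = Decimal("19.99")
subtotal = price * 3
tax = (subtotal * Decimal("0.0825")).quantize(
    Decimal("0.01"), rounding=ROUND_HALF_UP)  # round to whole cents

print(subtotal)                         # 59.97
print(tax)                              # 4.95

# Binary floats, by contrast, drift on decimal fractions:
print(0.1 + 0.2)                        # 0.30000000000000004
print(Decimal("0.1") + Decimal("0.2"))  # 0.3
```

The same exactness is what makes a ledger balance to the cent after millions of transactions.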
Why COBOL Remains Relevant Today
Despite being decades old, COBOL continues to run a significant percentage of global financial transactions. Its reliability, efficiency, and ability to handle large-scale data processing make it irreplaceable in many legacy systems. Additionally, modern COBOL compilers and integration tools have allowed developers to modernize COBOL applications without rewriting entire systems.
Career Opportunities in COBOL Mainframe Programming
Due to the reliance on legacy systems, COBOL programmers remain in demand. Roles in COBOL programming typically involve maintaining and enhancing existing systems, migrating legacy systems to modern platforms, and ensuring uninterrupted operations of mission-critical applications.
Conclusion
COBOL mainframe programming is far from obsolete. As businesses continue to rely on stable, secure, and efficient transaction processing systems, COBOL programmers will play an essential role in maintaining and innovating these platforms. Whether you're an aspiring developer or an experienced programmer, learning COBOL can open doors to unique opportunities in industries that drive global economies.
1 note
·
View note
Text
programming vent
my favorite thing about programming is how much visual noise web development has. like i'm a back-end dev because the registers and compilers speak to me. but whenever i try and learn web development, the course instructor always has like 15 files in vscode with some hard to read font and no semantic highlighting.
like wtf. is every front-end dev a neurotypical or something? i need to remove code comments and any code non-essential to function in order to understand libraries or codebases. i'd love to be at the coffee shop writing code but i literally cannot tune out the sound of every single conversation or the coffee maker. this is an unfortunate side effect of how my brain works. i have yet to solve the cocktail party problem.
if i use someone else's code i always feel stabbed in the back because it has x shortcoming that everyone is just okay with and will do the mental gymnastics to defend it and say it's good enough that my calculator built on redis and angular consumes 5gb of space between both the host and the browser and that sometimes keystrokes are just missed. like??? WHAT.
on the flip side i will go write everything myself out of discontentment and it's just what i need and it wasn't hard, but i can't stay focused for more than 5 minutes at a time. i've got that john carmack flavored autism. this is hell man. i'm probably a real extreme case of programming but cmon the job i just came from is some "the banks are run on COBOL" level shit.
none of it should work and inevitably it doesn't! it doesn't scale at all. but companies would rather hire 20 people to power a component team and keep adding bandaids than spend half of their salaries to sit down with 5 and rewrite decrepit shit from the ground up. but no, we gotta do r&d on adding 2fa to our account system because we'll get a ✨ tax break ✨.
1 note
·
View note
Text

Computer science has undergone remarkable transformations since its inception, shaping the way we live, work, and interact with technology. Understanding this evolution not only highlights the innovations of the past but also underscores the importance of education in this ever-evolving field. In this blog, we'll explore key milestones in computer science and the learning opportunities available, including computer science training in Yamuna Vihar and the Computer Science Training Institute in Uttam Nagar.
The Early Years: Foundations of Computing
The story of computer science begins in the mid-20th century with the development of the first electronic computers. The ENIAC, one of the earliest general-purpose computers, showcased the capabilities of machine computation. However, programming at that time required a deep understanding of machine language, which was accessible only to a select few.
Milestone: High-Level Programming Languages
The 1950s marked a pivotal moment with the introduction of high-level programming languages like FORTRAN and COBOL. These languages allowed developers to write code in a more human-readable form, significantly lowering the barrier to entry for programming. This shift made software development more approachable and laid the groundwork for future innovations.
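The gap those first high-level languages closed is still visible in any modern language: one readable line compiles down to a sequence of machine-level steps that early programmers had to write by hand. A small Python illustration using the standard dis module (Python is used here purely for convenience; the exact bytecode listing varies by interpreter version):

```python
import dis

# One human-readable line of source...
def average(a, b):
    return (a + b) / 2

# ...expands into several lower-level stack-machine instructions,
# roughly the kind of detail pre-FORTRAN programmers wrote themselves.
dis.dis(average)

print(average(4, 10))  # 7.0
```

Writing and debugging those instruction sequences directly, for every formula, is what made pre-1950s programming accessible only to a select few.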
The Personal Computer Revolution
The 1970s and 1980s ushered in the era of personal computing, with companies like Apple and IBM bringing computers into homes and offices. This democratization of technology changed how people interacted with computers, leading to the development of user-friendly interfaces and applications.
Milestone: The Internet Age
The rise of the internet in the late 20th century transformed communication and information sharing on a global scale. The introduction of web browsers in the 1990s made the internet accessible to the masses, resulting in an explosion of online content and services. This era emphasized the importance of networking and laid the foundation for the digital economy.
Specialization and Advanced Technologies
As computing technologies became more advanced, the need for specialized knowledge grew. Understanding data structures and algorithms became essential for optimizing code and improving software performance.
For those looking to enhance their skills, the Data Structure Training Institute in Yamuna Vihar offers comprehensive programs focused on these critical concepts. Mastering data structures is vital for aspiring developers and can significantly impact their effectiveness in real-world applications.
Milestone: Mobile Computing and Applications
The advent of smartphones in the early 2000s revolutionized computing once again. Mobile applications became integral to daily life, prompting developers to adapt their skills for mobile platforms. This shift highlighted the need for specialized education in app development and user experience design.
Current Trends: AI, Big Data, and Cybersecurity
Today, fields like artificial intelligence (AI), big data, and cybersecurity are at the forefront of technological innovation. AI is transforming industries by enabling machines to learn from data, while big data analytics provides insights that drive decision-making.
To prepare for careers in these dynamic fields, students can enroll in an advanced diploma in computer application in Uttam Nagar. This program equips learners with a strong foundation in software development, data management, and emerging technologies.
Additionally, Computer Science Classes in Uttam Nagar offer tailored courses for those seeking to specialize in specific areas, ensuring that students are well-prepared for the job market.
Conclusion
The evolution of computer science has been marked by significant milestones that have reshaped our technological landscape. As the field continues to advance, the demand for skilled professionals is higher than ever. By pursuing education in computer science, whether through computer science training in Yamuna Vihar, specialized data structure courses, or advanced diploma programs, you can position yourself for success in this exciting and ever-changing industry.
Embrace the opportunities available to you and become a part of the future of technology!
#computer science classes#datascience#computer science training in Yamuna Vihar#Computer Science Classes in Uttam Nagar
0 notes
Text
The Evolution and Impact of Software Development
Software development is a dynamic and rapidly evolving field that plays a crucial role in modern society. It encompasses the processes involved in creating, designing, deploying, and maintaining software systems. From the early days of simple programming to the current landscape of complex, integrated systems, software development has transformed how businesses operate and how individuals interact with technology.

A Brief History
The history of software development dates back to the mid-20th century with the invention of early computers. The first software was written in machine language, a tedious and error-prone process. With the development of assembly languages and high-level programming languages such as Fortran, COBOL, and later, C and Java, the process became more manageable and efficient.
In the 1980s and 1990s, the rise of personal computers and the internet revolutionized software development. This era saw the birth of the software industry as we know it, with companies like Microsoft and Apple leading the charge. The introduction of graphical user interfaces (GUIs) made software more accessible to the general public, further accelerating the industry's growth.
Modern Software Development Practices
Today, software development is characterized by several key practices and methodologies that enhance productivity, quality, and collaboration. Some of the most significant advancements include:
1. Agile Methodologies
Agile methodologies, such as Scrum and Kanban, have transformed how software is developed. Agile emphasizes iterative development, where software is built in small, incremental steps. This approach allows for continuous feedback, rapid adaptation to changes, and early delivery of valuable features. Agile methodologies promote collaboration among cross-functional teams, ensuring that all stakeholders are involved throughout the development process.
2. DevOps
DevOps is a set of practices that combines software development (Dev) and IT operations (Ops). It aims to shorten the software development lifecycle and deliver high-quality software continuously. DevOps practices include continuous integration (CI), continuous delivery (CD), and infrastructure as code (IaC). These practices enhance collaboration between development and operations teams, automate repetitive tasks, and improve deployment processes.
3. Cloud Computing
Cloud computing has revolutionized software development by providing scalable, on-demand resources. Platforms like Amazon Web Services (AWS), Microsoft Azure, and Google Cloud Platform (GCP) offer a wide range of services, from infrastructure to machine learning tools. Cloud computing enables developers to build, test, and deploy applications more efficiently and cost-effectively. It also facilitates collaboration and remote work, allowing teams to access resources and collaborate from anywhere in the world.
4. Artificial Intelligence and Machine Learning
Artificial intelligence (AI) and machine learning (ML) are becoming integral parts of modern software development. AI and ML enable the creation of intelligent applications that can learn from data, make predictions, and automate complex tasks. These technologies are used in various domains, including healthcare, finance, and entertainment, to enhance decision-making, personalize user experiences, and optimize operations.
The Impact of Software Development
Software development has a profound impact on various aspects of society and the economy:
1. Economic Growth
The software industry is a significant driver of economic growth. It creates jobs, fosters innovation, and enables the digital transformation of businesses. Software solutions streamline operations, reduce costs, and improve efficiency, contributing to increased productivity and competitiveness.
2. Social Change
Software development has transformed how people communicate, access information, and entertain themselves. Social media platforms, messaging apps, and streaming services have reshaped social interactions and entertainment consumption. Educational software and e-learning platforms have made education more accessible, especially in remote and underserved areas.
3. Healthcare
In healthcare, software development has led to advancements in medical research, patient care, and administration. Electronic health records (EHRs), telemedicine, and health monitoring apps are just a few examples of how software solutions improve patient outcomes and streamline healthcare services.
4. Environmental Impact
Software development also plays a role in addressing environmental challenges. Smart grid technology, renewable energy management systems, and environmental monitoring applications are examples of how software solutions contribute to sustainable development and environmental conservation.
The Future of Software Development
The future of software development is exciting and full of potential. Emerging technologies such as quantum computing, blockchain, and augmented reality (AR) are poised to redefine the field. Quantum computing promises to solve complex problems that are currently intractable for classical computers. Blockchain offers new possibilities for secure and transparent transactions, while AR and virtual reality (VR) are set to revolutionize user experiences and interactions.
Moreover, the increasing focus on cybersecurity, data privacy, and ethical considerations will shape the future of software development. As technology continues to advance, developers will need to address these challenges to build secure, trustworthy, and ethical software solutions.
Conclusion
Software development is a cornerstone of the digital age, driving innovation and transformation across all sectors. From its humble beginnings to its current state as a sophisticated and essential industry, software development continues to evolve, pushing the boundaries of what is possible. As we look to the future, the potential for further advancements and their impact on society is limitless, promising a world where technology continues to enhance and enrich our lives.
Get in touch today to kickstart your digital journey with AYB Infotech!
Email: [email protected] Phone: 02030265160 Website: www.aybinfotech.com Address: 167-169 Great Portland Street, 5th Floor, London, W1W 5PF
#appdevelopment#applaunch#appstore#digitalstrategy#digitaltransformation#mobilemarketing#mobiletech#onlinebusiness#digitalmarketing#innovation#Website Development#Mobile App Development#eCommerce Website Development#Software development
0 notes
Text
Rajeev Lakhanpal's Perspective on the Evolution of Computer Programming Languages

In the vast landscape of computer programming, the evolution of programming languages stands as a testament to the relentless pursuit of efficiency, expressiveness, and adaptability. Rajeev Lakhanpal, a renowned computer scientist and expert in programming languages, offers invaluable insights into this evolutionary journey. His perspective sheds light on the pivotal moments, paradigm shifts, and future trajectories that have shaped the diverse ecosystem of programming languages we navigate today.
Origins and Early Development
Lakhanpal traces the origins of programming languages to the early days of computing when pioneers like Ada Lovelace and Alan Turing laid the groundwork for computational thinking. With the advent of assembly languages, programmers could directly interact with the hardware, albeit in a low-level manner. However, the need for higher-level abstractions led to the development of languages like Fortran, Lisp, and COBOL, each catering to specific domains and programming paradigms.
Paradigm Shifts and Language Design
Throughout the decades, Lakhanpal observes significant paradigm shifts influencing language design. The emergence of structured programming in the late 1960s and early 1970s brought forth languages like Pascal and C, emphasizing clarity and modularity. Object-oriented programming (OOP) gained prominence in the 1980s with languages such as Smalltalk and C++, enabling the encapsulation of data and behavior within objects. Concurrently, functional programming languages like ML and Haskell introduced novel concepts such as immutability and higher-order functions, challenging traditional imperative approaches.
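These paradigm shifts are easiest to see side by side. Below is a minimal sketch of the same task, totaling an order in cents, in each of the three styles; Python is used only for brevity, and the historical languages named above would each be more idiomatic hosts for their own paradigm.

```python
from functools import reduce

# Imperative (Algol/Pascal/C lineage): explicit loop, mutable accumulator.
def total_imperative(prices):
    total = 0
    for p in prices:
        total += p
    return total

# Object-oriented (Smalltalk/C++ lineage): data and behavior are
# encapsulated together inside an object.
class Order:
    def __init__(self, prices):
        self._prices = list(prices)  # state owned by the object
    def total(self):
        return sum(self._prices)

# Functional (ML/Haskell lineage): a higher-order function folds over
# the data; nothing is mutated.
def total_functional(prices):
    return reduce(lambda acc, p: acc + p, prices, 0)

prices = [999, 450, 2000]  # cents
print(total_imperative(prices))   # 3449
print(Order(prices).total())      # 3449
print(total_functional(prices))   # 3449
```

All three compute the same result; what changes between paradigms is where state lives and how behavior is organized.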
The Rise of Domain-Specific Languages (DSLs)
Lakhanpal recognizes the rise of domain-specific languages as a pivotal trend in language evolution. DSLs, tailored to specific problem domains, offer unparalleled expressiveness and efficiency. Whether it's SQL for database queries, MATLAB for numerical computations, or HTML/CSS for web development, DSLs empower developers to express domain concepts concisely, fostering productivity and innovation.
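SQL, the first DSL named above, makes the expressiveness point concrete: a few declarative lines replace the loops, filtering, and sorting code a general-purpose language would need. A small sketch using Python's built-in sqlite3 driver; the table and data are invented for illustration.

```python
import sqlite3

# An in-memory database with a tiny sample table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (name TEXT, balance INTEGER)")
conn.executemany(
    "INSERT INTO accounts VALUES (?, ?)",
    [("alice", 1200), ("bob", 300), ("carol", 850)],
)

# The DSL states *what* we want -- rows over a threshold, sorted --
# not *how* to scan, compare, and order them.
rows = conn.execute(
    "SELECT name, balance FROM accounts"
    " WHERE balance > ? ORDER BY balance DESC",
    (500,),
).fetchall()
print(rows)  # [('alice', 1200), ('carol', 850)]
```

The query engine chooses the scan strategy; the developer expresses only the domain concept, which is exactly the productivity argument for DSLs.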
The Age of Polyglot Programming
In today's diverse technological landscape, Lakhanpal emphasizes the prevalence of polyglot programming. Developers leverage a multitude of languages, frameworks, and tools to address varying requirements and constraints. From the versatility of Python for data science to the performance of Rust for system programming, the polyglot approach underscores the importance of selecting the right tool for the job, promoting interoperability and scalability.
Future Trajectories and Language Design Challenges
Looking ahead, Lakhanpal anticipates continued innovation and diversification in programming languages. With the proliferation of emerging technologies such as quantum computing, artificial intelligence, and blockchain, new language paradigms and abstractions will inevitably emerge. However, he acknowledges the inherent challenges in language design, including balancing simplicity with expressiveness, managing complexity, and addressing the evolving demands of modern computing environments.
Conclusion
Rajeev Lakhanpal's perspective on the evolution of computer programming languages provides a comprehensive framework for understanding the dynamic interplay between technological advancements, language design principles, and developer preferences. As we navigate an ever-changing landscape of software development, his insights serve as a guiding beacon, illuminating the past, present, and future trajectories of programming language evolution.
1 note
·
View note
Text
Reprogramming the Future: How AI is Redefining Developers and Languages
New Post has been published on https://thedigitalinsider.com/reprogramming-the-future-how-ai-is-redefining-developers-and-languages/
Reprogramming the Future: How AI is Redefining Developers and Languages
The era of AI-powered programming is upon us, and itâs not just a supporting act; itâs stealing the limelight. AI is already rewriting the rules of code creation. However, this is just the tip of the iceberg when it comes to its potential. In the not-so-distant future, algorithms are poised to eliminate language barriers and radically transform the role of human developers. So, are we witnessing the end of the human programmer as we know it? Letâs find out.
AI's Impact: Progress and Challenges
The CEO of Stability AI paints a dark picture for programmers, boldly claiming that artificial intelligence will replace them within just five years. OpenAI is going all-in, assembling an "army" of external contractors to supercharge its model training, potentially obliterating entry-level coding jobs. Bloomberg ominously declares that India's massive pool of 5 million coders is on the brink of an AI jobpocalypse. Despite these dire forecasts, discussions on Reddit suggest that many programmers are nonchalant about their job security. But can we afford to remain so complacent in the face of such a radical shift?
If you think AI is just a sideshow, perhaps you should reconsider. It's true that right now, though AI can mimic the syntax and structure of human-written output, it often struggles to comprehend the "why" behind the "what." In other words, it lacks a deep understanding of the underlying logic and intent.
Still, already a staggering 92% of US-based developers are embracing AI coding tools, both at work and in their free time. These intelligent algorithms can whip up 40% of your code, from simple scripts to complex ones. Human error is becoming a thing of the past. Development speed is turbocharged, with AI slashing code documentation time by 45-50% and reducing code writing time by 35-45%.
AI's reach isn't limited to a single language; it spans them all. Our own data shows that Java, Python, and C++ developers benefit equally from Machinet's AI chat feature, which can generate code by using the context of a particular project and a description provided. This inclusivity leads to a 25% boost in user engagement.
But let's not stop there. AI already exposes bugs in applications, ensuring that products are rock-solid, reliable, and robust. Neural networks can scan tirelessly for vulnerabilities that humans might miss. AI is honing its skills to identify software's soft spots and boost its defenses, bringing us one step closer to a future where human oversight might become obsolete.
AI's algorithms are even mastering the art of code translation. AI is like a polyglot programmer that analyzes code written in one language, then creates an equivalent version in another. Examples are already there: IBM has recently unveiled its assistant, which uses an AI model to translate COBOL into Java. The question is, who needs human experts or multiple programming languages when AI will finally be able to do it all?
The End of Language Diversity
I am confident that there's no stopping the rise of Large Language Models like GPT-4. They understand both natural language and code, blurring the boundaries like never before.
The AI takeover raises questions about the future of the programming landscape. Today, hundreds of programming languages exist, and new ones are developed regularly. Several are actively used in the industry. According to the PYPL Index, Python is the most popular language worldwide, followed by Java, JavaScript, C#, and C/C++. Other data shows that as of 2022, JavaScript was the most common among software developers. Some languages are suitable for similar purposes and applications, Java and Go being one example.
So, will these languages, each with its own niche and purpose, become useless as AI grows increasingly proficient at coding? I believe that AI is on the verge of rendering older, slower, and less secure technologies obsolete. This could potentially lead to a centralization of languages, with only the fastest and most efficient ones enduring. Developers will no longer choose them based on personal preferences or historical codebases. Instead, they will be selected for their performance. AI-driven tools will meticulously analyze and benchmark them to identify the optimal choices for specific tasks. These analyses will take into account factors such as execution speed, memory usage, and scalability.
A central, AI-friendly language for general coding tasks may even emerge. Still, a few specialized ones will have their place in niche domains, such as scientific computing. AI can facilitate their integration when specific problems require their usage. This hybrid approach will combine the efficiency of centralization with the power of specialization, offering flexibility and diversity in the development process.
Legacy Systems in the Crosshairs
AI's influence extends beyond the creation of new code; it is also a potential legacy-killer. Migration from outdated languages to newer, more efficient ones can be a cumbersome and costly process. Yet, holding onto legacy systems is also a financial burden. Typically, technology teams allocate around 75% of their development budget to maintenance tasks. And if an organization continues to rely on legacy solutions, it can anticipate an annual budget increase of approximately 15%.
This is where AI-driven migration tools step in. They will make it easier for organizations to update their existing software to the optimal languages of this new era. AI-powered products will automatically analyze and understand the intricacies of outdated codebases. They will identify the core functionality, dependencies, and potential issues within the legacy code, making it far easier to plan and execute the migration process.
I even expect AI to identify the most suitable language for a given project and automatically convert the codebase, rewriting sections to adhere to best practices, eliminating redundant or deprecated functions, and optimizing the result for improved performance and security. In this way, AI-driven migration tools will gradually make legacy code a relic of the past.
Will Human Programmers Survive the Revolution?
Eventually, in this AI-dominated landscape, the role of human programmers will transform. Instead of writing code manually, they will bridge the gap between business needs and AI capabilities. They will define objectives, provide feedback, and ensure that the code aligns with their vision. In essence, developers will become "connectors" with basic programming knowledge. At the same time, I can see AI coding assistants evolving into holistic solutions featuring user-friendly interfaces that empower people to effectively communicate their needs to algorithms.
These changes are going to democratize the field of programming. Currently, there are over 26 million software developers worldwide. The advancements in AI are paving the way for billions of people to step into the role of software creators. They will be able to request algorithms to craft tailored applications, be it games or corporate programs. Want to create a new version of Angry Birds featuring cats? Simply explain your ideas to AI systems and obtain immediate results, without needing to understand how exactly this black box works.
In this context, a pressing question arises: what lies in store for junior and mid-level developers within this emerging paradigm? In my view, not much. AI is poised to outperform them significantly in every aspect. They might find themselves becoming AI supervisors or independently honing their skills, perhaps by engaging in less financially rewarding projects, to attain the proficiency level of well-qualified and high-paid programmers.
The latter group will remain in demand in sectors where errors are costly, and a 5% improvement in accuracy can translate into millions or even billions of savings. These are, for example, high-frequency trading, where a mere 10-millisecond variance can determine profit or loss, banking, and military technology programming.
This shift will create a genuine global competition among programmers. Currently, it operates within a somewhat pseudo-global framework. Unlike musicians competing on platforms like Spotify with peers from across the globe, developers can still primarily focus on local markets and specific tasks. However, the market where AI can manage a substantial share of programming tasks will become hardcore. Being "good enough" will no longer suffice. Programmers will need to strive for excellence to compete with both peers worldwide and AI.
#2022#ai#ai model#AI systems#AI-powered#Algorithms#applications#approach#Art#artificial#Artificial Intelligence#banking#benchmark#birds#black box#box#bridge#bugs#Business#cats#CEO#code#code documentation#codebase#coding#computing#craft#creators#Dark#data
1 note
¡
View note
Text
Do you trust them not to steal the data, given how at least one of the hackers he hired has a history of working with cyber criminals and another was fired from a company because he leaked information?
Do you trust people so incompetent at their job that they claimed, and are still claiming, that routine COBOL error messages are somehow proof of massive fraud, to update a program written in COBOL?
Do you trust them not to completely fuck up the new website either through incompetence or on purpose as a way to steal people's benefits, maybe declare people dead or delete them for "fraud" if they don't like their last names or where they live?
Do you think using AI in code that is vital to the survival of so many Americans is a good idea?
From the article:
"The DOGE team has already been reportedly running highly sensitive government data through AI, as the Washington Post reported last month, so why not use it to cheat-code your way to a more modern programming language? The reason, of course is the risk of cascading failures during any rush-job that might mean missed payments or beneficiary information getting wiped from the system entirely."
This is utterly terrifying, especially given the fact that they've already completely funked up Social Security phone service. How do I know? Just over a month ago, I called to do the quick phone tree to get a proof of income from them, something I have to do multiple times a year because various programs want them and they need to be very recent. The phone tree had been noticeably improved since last time I'd used it in the fall. When I called today 3/31/25, they had completely removed all the quick phone tree options.
They took a service that was completely automated in the last ten years, and thus super cheap and already in place, for people with a bunch of routine, common, queries and yanked all that out, requiring people to get in line for a live person. Last time I needed live agent service it took about five hours to get back to me.
They are lying that this is about efficiency and saving money. Leaving the automated system in place is dramatically cheaper than paying people to answer, especially at a time they are firing people.
This is meant to break the system and force the people who need their benefits the most out of the system.
Musk has given the goal of stealing Social Security benefits away from people who earned the benefit and actually need it:
"In fact, what we're doing will help their benefits," Musk said. "Legitimate people, as a result of the work of DOGE, will receive more Social Security, not less. I want to emphasize that. As a result of the work of DOGE, legitimate recipients of Social Security will receive more money, not less money."
The only way that happens is to take it away from the majority of recipients. You know the people Lutnick claims are fraudsters if they complain at the theft of their rent and electricity bill money recently.
Have something you want to tell your Congress Critters?
If you can't safely contact them in person, here are some other options:
Five Calls to your critters: https://5calls.org/
Here is one that will send your reps a fax: https://resist.bot/
Scream loudest at republican Critters. Those are their voters Musk is trying to kill, but whatever critters you have, stay noisy. We have until 4/14 to stop them.
15 notes
¡
View notes
Text
Follow-up to my latest brain-dump, aka the suggestions themselves?
I mean, it all starts with a healthy and rewarding routine with good habits to be acquired. (Which does mean getting outdoors walking, being mindful, and dividing-and-conquering my time to reach goals of mine ;)
And I will get there in due time. Anyhow, here are goals to look forward to.
Level design for Doom, Duke Nukem 3D, Quake, Red Eclipse, Counter-Strike 2...
Linux Artix ecosystem with Fish shell and KDE Liquid desktop environment
RISC-V & OpenPOWER ISA
Historical and sidestream data & hardware experience preservation
Library economy stuff...
FORTRAN, COBOL, REXX, Assemblers programming... (in strong demand and I wanted to dip my toes there some) for mainframes and whatnot...
Steel Bank Common Lisp (and other Lisp dialects)
Nim, Zig, C (2023 revision), Crystal...
Godot, GDScript, Qodot+SimpleFPC...
Raymarching and Compute Shaders?
MegaOCEAN NPCs and QuestMachine Systemic Design from PixelCrushers
Benevolent Social Ecosystem Design...
Indie storytelling + manifestation toybox development...
EVALDRAW / BUILD2 (overall Ken Silverman stuff)
Creative storytelling and descriptive workflow for multimedia cute toons
Motion Canvas, data visualization & mathematics...
Whatever cool miniature projects I want to dive into
And ofc existing purchased courseware I have yet to finish...
0 notes
Quote
The traditional bread and butter of a mainframe are COBOL programs, data files, and JCL, or Job Control Language. Most companies run large COBOL programs developed over the decades, and programs are run and data is processed in non-interactive jobs. The concept of batch computer jobs goes back to the '50s and '60s and predates the era of interactive, timeshared minicomputers. A basic job consists of the computer code to be run, the input data to be processed, an understanding of how to process the output data, and job commands that communicate the processing steps to the Z/OS operating system. Large amounts of data can be processed very efficiently in non-interactive batches. Below are some examples of JCL and COBOL code.
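The quote promises examples, so here is a hedged sketch of what such a batch job can look like: a single JCL stream that compiles, link-edits, and runs an inline COBOL program via IGYWCLG, IBM's stock compile-link-go cataloged procedure for Enterprise COBOL. The job name, accounting field, and output classes are illustrative; real sites customize the procedure name, libraries, and classes.

```jcl
//HELLO01  JOB (ACCT),'BATCH DEMO',CLASS=A,MSGCLASS=X
//* One batch job: compile, link-edit, and execute an inline COBOL program
//STEP1    EXEC IGYWCLG
//COBOL.SYSIN DD *
       IDENTIFICATION DIVISION.
       PROGRAM-ID. HELLO.
       PROCEDURE DIVISION.
           DISPLAY 'HELLO FROM A BATCH JOB'.
           STOP RUN.
/*
//GO.SYSOUT DD SYSOUT=*
//
```

The `//COBOL.SYSIN DD *` statement feeds the in-stream source to the compiler step of the procedure, and `//GO.SYSOUT` routes the program's DISPLAY output to the job log, which is exactly the non-interactive input-process-output shape the quote describes.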
The IBM mainframe: How it runs and why it survives
0 notes
Text
Many programmers were slow to move from COBOL or Fortran due to a perceived complexity of the language and immaturity of the PL/I F compiler. Programmers were sharply divided into scientific programmers (who used Fortran) and business programmers (who used COBOL), with significant tension and even dislike between the groups. PL/I syntax borrowed from both COBOL and Fortran syntax. So instead of noticing features that would make their job easier, Fortran programmers of the time noticed COBOL syntax and had the opinion that it was a business language, while COBOL programmers noticed Fortran syntax and looked upon it as a scientific language.
Oh my let's take a look at the sample program section and see what this looks like--

You know, as something of a COBOL hater myself,
An early reference to the "Law of Least Astonishment" appeared in the PL/I Bulletin in 1967 (PL/I is a programming language released by IBM in 1966). By the late 1960s, PL/I had become infamous for violating the law, for example because, due to PL/I's precision conversion rules, the expressions "25 + 1/3" and "1/3 + 25" would either produce a fatal error, or, if errors were suppressed, 5.33333333333 instead of the correct 25.33333333333.
astonishing
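The arithmetic behind that astonishment can be sketched from PL/I's fixed-point precision rules. The formulas below are the commonly summarized ones (operand precision written as (digits, scale), implementation maximum N = 15); treat them as an approximation of the language reference rather than a definitive spec.

```latex
% Division: result precision of a/b with operands (p_1,q_1) and (p_2,q_2)
\operatorname{prec}(a/b) = \bigl(N,\; N - (p_1 - q_1) - q_2\bigr), \qquad N = 15
% so 1/3, i.e. (1,0) \div (1,0), gets scale 15 - 1 = 14:
1/3 \;\Rightarrow\; (15,\,14) \;\Rightarrow\; 0.33333333333333
% Addition: result precision keeps the larger scale
\operatorname{prec}(a+b) = \bigl(1 + \max(p_1-q_1,\,p_2-q_2) + \max(q_1,q_2),\; \max(q_1,q_2)\bigr)
% 25 is (2,0), so the sum needs 1 + \max(2,1) + 14 = 17 digits, capped at 15
% with scale 14: only one integer digit survives, and the leading 2 is lost:
25 + 0.33333333333333 \;\Rightarrow\; 5.33333333333333
```

With the FIXEDOVERFLOW condition enabled, the same expression raises a fatal condition instead, which matches the "fatal error" branch in the quote above.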
332 notes
¡
View notes