# API bridges
How API Bridges Work in Algo Trading
API bridges are a crucial part of algorithmic trading, allowing trading platforms, brokers, and custom trading algorithms to work together seamlessly. They provide real-time data transfer and order execution, making trading strategies faster, more efficient, and more accurate. In this article, we explain how API bridges work in algo trading and explore their importance for traders and developers, especially in India.
What is algorithmic trading?
Algorithmic trading is the use of computer algorithms to automatically execute trades based on pre-defined criteria such as market conditions, technical indicators, or price movements. Unlike manual trading, algorithmic trading allows traders to make faster decisions and execute multiple orders simultaneously, minimizing human error and maximizing potential profits.
Understanding API Bridges in Algo Trading
API bridges are the connector layer between software platforms, allowing them to communicate with each other. In algo trading, an API bridge links a trading algorithm running on platforms such as Amibroker, MetaTrader 4/5, or TradingView to the broker's trading system for automated order execution.
Key Functions of API Bridges in Algorithmic Trading
Data Feed Integration: API bridges give the algorithm direct access to live market data from the broker's system, such as current stock prices, volumes, and order books. This serves as the information base the algorithm interprets for decision-making.
Order Execution: Once the algorithm identifies a suitable trading opportunity, the API bridge sends the buy or sell order directly to the broker's trading system. This process is automated, ensuring timely execution without manual intervention.
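As a concrete illustration, the data-feed and order-execution loop can be sketched in a few lines of Python. The `BridgeClient` below is a stub standing in for a real vendor SDK; its method names and the quote values are invented for illustration:

```python
# Minimal sketch of the data-feed and order-execution loop an API bridge
# automates. BridgeClient is a stub standing in for a real vendor SDK;
# its method names and the quote values are invented for illustration.

class BridgeClient:
    """Stub for an API bridge between a strategy and a broker."""

    def __init__(self):
        self.orders = []

    def get_quote(self, symbol):
        # a real bridge would stream live ticks from the broker's feed
        return {"symbol": symbol, "last": 101.5, "sma_20": 100.0}

    def place_order(self, symbol, side, qty):
        order = {"symbol": symbol, "side": side, "qty": qty}
        self.orders.append(order)  # a real bridge forwards this to the broker
        return order

def run_strategy(bridge, symbol):
    """Toy rule: buy when the last price is above its 20-period average."""
    tick = bridge.get_quote(symbol)
    if tick["last"] > tick["sma_20"]:
        return bridge.place_order(symbol, "BUY", qty=10)
    return None

bridge = BridgeClient()
order = run_strategy(bridge, "RELIANCE")
print(order)  # {'symbol': 'RELIANCE', 'side': 'BUY', 'qty': 10}
```

A production bridge would stream live ticks and forward orders over the broker's authenticated API rather than an in-memory list, but the shape of the loop is the same.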
Backtesting: API bridges enable traders to backtest their algorithms using historical data to evaluate performance before executing real trades. This feature is particularly useful for optimizing strategies and reducing risks.
Risk Management: An effective API bridge helps implement risk-management protocols in trading algorithms, for example stop-loss or take-profit orders. When specific conditions are met, such orders are entered automatically, reducing emotional decision-making and limiting losses.
Trade Monitoring: The API bridge continuously monitors trade execution, providing real-time updates on orders, positions, and account balances so traders stay informed and can adjust their algorithms.
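The stop-loss and take-profit automation described above amounts to a simple rule check the bridge runs on every price update. This is a hedged sketch with invented thresholds, not any vendor's actual logic:

```python
# Illustrative stop-loss / take-profit check a bridge could run on each
# price update for an open long position. Thresholds are invented.

def check_exit(position, last_price):
    """Return an exit order if a protective level is breached, else None."""
    if last_price <= position["stop_loss"]:
        return {"symbol": position["symbol"], "side": "SELL", "reason": "stop-loss"}
    if last_price >= position["take_profit"]:
        return {"symbol": position["symbol"], "side": "SELL", "reason": "take-profit"}
    return None

pos = {"symbol": "TCS", "entry": 4000.0, "stop_loss": 3920.0, "take_profit": 4160.0}
print(check_exit(pos, 3910.0))  # {'symbol': 'TCS', 'side': 'SELL', 'reason': 'stop-loss'}
print(check_exit(pos, 4050.0))  # None, position stays open
```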
Why Algo Trading Needs API Bridges
Speed and Efficiency: API bridges enable high-frequency trading (HFT), allowing traders to execute thousands of trades per second with minimal delay. This speed is critical in fast-moving markets where timing determines profitability.
Customization: Because custom-built algorithms can interact with many brokers through an API bridge, traders can personalize their strategies and implement advanced approaches that would be impossible to execute manually.
Seamless Integration: API bridges let traders connect their favorite platforms, such as Amibroker or TradingView, with brokers like Angel One, Alice Blue, or Zerodha. Traders can keep using software they are familiar with while benefiting from the broker's execution capabilities.
Cost-Effective: Compared to hiring a dedicated team of traders or using expensive proprietary systems, API bridges are a more economical option for algo traders, delivering the power of automation without high overhead costs.
Improved Risk Management: By automating risk controls, such as loss and profit limits, the system ensures trades are executed with controlled risk, helping traders in India and worldwide manage their exposure better.
How API Bridges Work with Popular Trading Platforms
Amibroker: Amibroker is a popular tool among algo traders for technical analysis and backtesting. Integrating Amibroker with an API bridge lets traders execute strategies in real time through their preferred broker's interface, enriching the trading experience.
MetaTrader 4/5: MetaTrader is another widely used platform for algorithmic trading. Through an API bridge, traders can link their trading robots (Expert Advisors) to brokers supporting the MT4 or MT5 platforms to execute trades automatically based on their algorithms.
TradingView: TradingView is a renowned charting platform famous for its user-friendly interface and its powerful scripting language, Pine Script. With an API bridge, users can send real-time trading signals to their brokers for execution.
The Best API Bridges for Algo Trading in India
Combiz Services Pvt. Ltd.: Combiz Services Pvt. Ltd. provides customized API solutions that ensure seamless integration between brokers and trading platforms. Their API bridges support a wide range of trading platforms, including Amibroker, MetaTrader, and TradingView, making them a good option for Indian traders seeking flexibility and speed in algorithmic trading.
AlgoTrader: AlgoTrader provides an advanced algorithmic trading platform that supports integration with various brokers through API bridges. It is known for its scalability and high-speed trading capabilities, making it a favorite among professional traders.
Interactive Brokers API: Interactive Brokers offers a robust API that allows traders to link their algorithms directly to its trading platform. With a rich set of features such as market data feeds and execution capabilities, the Interactive Brokers API bridge is highly regarded by algo traders.
How to Set Up an API Bridge for Algo Trading
Select a Trading Platform and Broker: Choose a trading platform such as Amibroker or MetaTrader, then pick a broker that provides API access, such as Zerodha or Alice Blue.
Connect the API: Once you have selected a platform and broker, connect the API bridge between your algorithm and the broker's system. This step generally involves configuring settings and API keys.
Create or Select an Algorithm: If you are new to algo trading, you can use pre-built strategies or create your own using programming languages like Python or AFL (AmiBroker Formula Language).
Backtest the Algorithm: Before deploying the algorithm, backtest it with historical data to ensure it performs as expected.
Monitor and Adjust: After you have deployed the algorithm, monitor its performance and make adjustments according to the changing market conditions.
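As a rough sketch of the "Connect the API" step, most broker APIs issue a key and secret pair that the bridge uses to open an authenticated session. The class, fields, and URL below are hypothetical, not from any specific broker's SDK:

```python
# Hypothetical sketch of the "Connect the API" step: brokers typically
# issue an API key/secret pair that the bridge uses to open an
# authenticated session. Class name, fields, and URL are all invented.

class BridgeSession:
    def __init__(self, api_key: str, api_secret: str, broker_url: str):
        self.api_key = api_key
        self.api_secret = api_secret
        self.broker_url = broker_url
        self.connected = False

    def connect(self) -> bool:
        # a real bridge performs a signed handshake with the broker here;
        # this stub only checks that credentials were supplied
        self.connected = bool(self.api_key and self.api_secret)
        return self.connected

session = BridgeSession(api_key="YOUR_KEY", api_secret="YOUR_SECRET",
                        broker_url="https://api.example-broker.in")
print(session.connect())  # True
```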
Conclusion
API bridges are a must-have tool in the world of algorithmic trading, providing smooth integration, faster execution, and improved risk management. Whether you use Amibroker, MetaTrader, or TradingView, API bridges ensure that your trading strategy is executed efficiently and effectively. The power of API bridges enables traders to stay ahead in the competitive world of algo trading and maximize opportunities in the Indian stock market.
For someone seeking a robust and highly customizable solution for algo trading needs, Combiz Services Pvt. Ltd. has the best API bridge services that guarantee seamless integration and faster trade execution.
jcmarchi · 11 days ago
Magistral: Mistral AI challenges big tech with reasoning model
New Post has been published on https://thedigitalinsider.com/magistral-mistral-ai-challenges-big-tech-with-reasoning-model/
Mistral AI has pulled back the curtain on Magistral, their first model specifically built for reasoning tasks.
Magistral arrives in two flavours: a 24B parameter open-source version called Magistral Small that anyone can tinker with, and a beefier enterprise edition, Magistral Medium, aimed at commercial applications where advanced reasoning capabilities matter most.
“The best human thinking isn’t linear—it weaves through logic, insight, uncertainty, and discovery,” explains Mistral AI.
That’s a fair point: existing models often struggle with the messy, non-linear way humans actually think through problems. I’ve tested numerous reasoning models, and they typically suffer from three key limitations: they lack depth in specialised domains, their thinking process is frustratingly opaque, and they perform inconsistently across different languages.
Mistral AI’s real-world reasoning for professionals
For professionals who’ve been hesitant to trust AI with complex tasks, Magistral might change some minds.
Legal eagles, finance folks, healthcare professionals and government workers will appreciate the model’s ability to show its work. All conclusions can be traced back through logical steps—crucial when you’re operating in regulated environments where “because the AI said so” simply doesn’t cut it.
Software developers haven’t been forgotten either. Magistral claims to shine at the kind of structured thinking that makes for better project planning, architecture design, and data engineering. Having struggled with some models that produce plausible-sounding but flawed technical solutions, I’m keen to see if Magistral’s reasoning capabilities deliver on this front.
Mistral claims their reasoning model excels at creative tasks too. The company reports that Magistral is “an excellent creative companion” for writing and storytelling, capable of producing both coherent narratives and – when called for – more experimental content. This versatility suggests we’re moving beyond the era of having separate models for creative versus logical tasks.
What separates Magistral from the rest?
What separates Magistral from run-of-the-mill language models is transparency. Rather than simply spitting out answers from a black box, it reveals its thinking process in a way users can follow and verify.
This matters enormously in professional contexts. A lawyer doesn’t just want a contract clause suggestion; they need to understand the legal reasoning behind it. A doctor can’t blindly trust a diagnostic suggestion without seeing the clinical logic. By making its reasoning traceable, Magistral could help bridge the trust gap that’s held back AI adoption in high-stakes fields.
Having spoken with non-English AI developers, I’ve heard consistent frustration about how reasoning capabilities drop off dramatically outside English. Magistral appears to tackle this head-on with robust multilingual support, allowing professionals to reason in their preferred language without performance penalties.
This isn’t just about convenience; it’s about equity and access. As countries increasingly implement AI regulations requiring localised solutions, tools that reason effectively across languages will have a significant advantage over English-centric competitors.
Getting your hands on Magistral
For those wanting to experiment, Magistral Small is available now under the Apache 2.0 licence via Hugging Face. Those interested in the more powerful Medium version can test a preview through Mistral’s Le Chat interface or via their API platform.
Enterprise users looking for deployment options can find Magistral Medium on Amazon SageMaker, with IBM WatsonX, Azure, and Google Cloud Marketplace implementations coming soon.
As the initial excitement around general-purpose chatbots begins to wane, the market is hungry for specialised AI tools that excel at specific professional tasks. By focusing on transparent reasoning for domain experts, Mistral has carved out a potentially valuable niche.
Founded just last year by alumni from DeepMind and Meta AI, Mistral has moved at breakneck speed to establish itself as Europe’s AI champion. They’ve consistently punched above their weight, creating models that compete with offerings from companies many times their size.
As organisations increasingly demand AI that can explain itself – particularly in Europe where the AI Act will require transparency – Magistral’s focus on showing its reasoning process feels particularly timely.
(Image by Stephane)
See also: Tackling hallucinations: MIT spinout teaches AI to admit when it’s clueless
Want to learn more about AI and big data from industry leaders? Check out AI & Big Data Expo taking place in Amsterdam, California, and London. The comprehensive event is co-located with other leading events including Intelligent Automation Conference, BlockX, Digital Transformation Week, and Cyber Security & Cloud Expo.
Explore other upcoming enterprise technology events and webinars powered by TechForge here.
winklix · 15 days ago
How Custom APIs Can Bridge Your Old Systems with Modern Tools
In an era of rapid digital transformation, companies face mounting pressure to stay agile, integrated, and scalable. Yet, for many businesses, legacy systems—software built decades ago—remain deeply embedded in their day-to-day operations. While these systems often house critical data and workflows, they are frequently incompatible with modern tools, cloud services, and mobile apps.
This is where custom APIs (Application Programming Interfaces) come in. Rather than rip and replace your existing systems, custom APIs offer a powerful and cost-effective way to connect legacy infrastructure with modern applications, extending the life and value of your current technology stack.
In this blog, we explore how custom APIs bridge the gap between the old and new, and why partnering with a custom software development company in New York is the key to unlocking this integration.
Understanding the Legacy Dilemma
Legacy systems are often mission-critical, but they can be:
Built on outdated technologies
Poorly documented
Incompatible with cloud-native tools
Rigid and difficult to scale
Lacking modern security standards
Despite their limitations, many organizations can’t afford to replace these systems outright. The risks of downtime, data loss, and cost overruns are simply too high. The solution? Wrap them in custom APIs to make them work like modern systems—without the need for complete redevelopment.
What Are Custom APIs?
A custom API is a tailored interface that allows software systems—regardless of their age or architecture—to communicate with each other. APIs translate data and commands into a format that both the old system and the new tools can understand.
With a well-designed API, you can:
Expose data from a legacy database to a web or mobile app
Allow cloud tools like Salesforce, Slack, or HubSpot to interact with your internal systems
Enable real-time data sharing across departments and platforms
Automate manual workflows that previously required human intervention
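To picture the "translate" role such an API plays, here is a deliberately simplified sketch that parses a legacy fixed-width record into JSON a modern app can consume. The record layout is invented for illustration:

```python
# Hypothetical example: a legacy system exports fixed-width records and
# the custom API layer parses them into JSON for a web or mobile client.
# The field layout is invented, not taken from any real system.
import json

# (field name, start offset, end offset) within the fixed-width record
LAYOUT = [("customer_id", 0, 6), ("name", 6, 26), ("balance", 26, 36)]

def record_to_json(record: str) -> str:
    row = {}
    for field, start, end in LAYOUT:
        value = record[start:end].strip()
        row[field] = float(value) if field == "balance" else value
    return json.dumps(row)

# build a sample 36-character record: 6-char id, 20-char name, 10-char balance
legacy_row = "000042" + "Jane Smith".ljust(20) + "1250.75".rjust(10)
print(record_to_json(legacy_row))
# {"customer_id": "000042", "name": "Jane Smith", "balance": 1250.75}
```

In practice this parsing sits behind an HTTP endpoint, but the translation step is the heart of the bridge.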
By working with a software development company in New York, you can create secure, scalable APIs that connect your most valuable legacy systems with cutting-edge technologies.
Why Modern Businesses Need Custom APIs
1. Avoid Expensive System Replacements
Rebuilding or replacing legacy systems is expensive, time-consuming, and risky. Custom APIs offer a lower-cost, lower-risk alternative by extending system capabilities without touching the core.
Companies that work with custom software development companies in New York can protect their existing investments while gaining modern functionality.
2. Enable Integration with Cloud and SaaS Platforms
Most modern SaaS platforms rely on APIs for connectivity. Custom APIs allow legacy systems to seamlessly:
Push data into cloud CRMs or ERPs
Trigger automated workflows via platforms like Zapier or Make
Sync with eCommerce, finance, or HR software
This means your old software can now "talk" to cutting-edge services—without needing a full migration.
3. Boost Business Agility
In today's fast-paced environment, agility is a competitive advantage. APIs let you innovate quickly by connecting existing systems to new apps, dashboards, or devices. This flexibility makes it easier to adapt to new customer demands or market trends.
That’s why the top software development companies in New York prioritize API-first development in digital transformation strategies.
Real-World Use Cases: Bridging Old and New with APIs
1. Legacy ERP + Modern BI Tools
A manufacturing firm uses a 20-year-old ERP system but wants modern analytics. A custom software development company in New York creates APIs that extract relevant data and feed it into Power BI or Tableau for real-time dashboards—no need to rebuild the ERP.
2. Mainframe + Mobile Application
A financial services provider wants to offer customers a mobile app without replacing its mainframe. A custom API acts as the bridge, securely transmitting data between the mobile frontend and the back-end system.
3. CRM Integration
A healthcare provider uses a legacy patient database but wants to sync data with Salesforce. Custom APIs enable bidirectional integration, allowing staff to use Salesforce without duplicating data entry.
Key Features of a High-Quality Custom API
When you work with a custom software development company in New York, you can expect your API to be:
1. Secure
APIs are a potential attack vector. A quality solution includes:
Token-based authentication (OAuth 2.0)
Role-based access control
Encryption in transit and at rest
Audit logs and usage tracking
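As a minimal, framework-free sketch of what token-based authentication with role-based access control and audit logging might look like (the token store here is an illustrative stand-in; real systems validate signed OAuth 2.0 tokens):

```python
# Illustrative token auth + role-based access control + audit log.
# TOKENS is a stand-in for real signed-token validation (e.g. OAuth 2.0).

TOKENS = {"tok-abc": {"user": "analyst1", "role": "read"},
          "tok-xyz": {"user": "admin1", "role": "admin"}}

AUDIT_LOG = []  # usage tracking: every access attempt is recorded

def require_role(token: str, needed: str):
    claims = TOKENS.get(token)
    allowed = claims is not None and claims["role"] in (needed, "admin")
    AUDIT_LOG.append({"token": token, "needed": needed, "allowed": allowed})
    if not allowed:
        raise PermissionError("access denied")
    return claims

def get_customer(token: str, customer_id: int):
    require_role(token, "read")
    return {"id": customer_id, "name": "Jane Smith"}  # stubbed legacy lookup

print(get_customer("tok-abc", 42)["name"])  # analyst with 'read' role: ok
```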
2. Scalable
Your API must be able to handle increased traffic as your business grows. Proper load balancing, caching, and rate limiting ensure performance under load.
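One common implementation of the rate limiting mentioned above is a token bucket, sketched here with illustrative parameters:

```python
# Sketch of a token-bucket rate limiter, one common way an API gateway
# enforces rate limits. Rate and capacity values are illustrative.
import time

class TokenBucket:
    def __init__(self, rate: float, capacity: int):
        self.rate = rate            # tokens refilled per second
        self.capacity = capacity    # maximum burst size
        self.tokens = float(capacity)
        self.last = time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

bucket = TokenBucket(rate=5, capacity=10)   # ~5 requests/s, bursts of 10
results = [bucket.allow() for _ in range(12)]
# the first 10 requests burst through; later calls are throttled
# until tokens refill
```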
3. Well-Documented
Clear documentation ensures your development teams and third-party vendors can integrate efficiently. OpenAPI/Swagger standards are often used for this purpose.
4. Versioned and Maintainable
APIs must be version-controlled to allow for upgrades without breaking existing integrations. A skilled software development company in New York will build a future-proof architecture.
Custom APIs vs Middleware: What’s the Difference?
Some businesses confuse APIs with middleware. While both can connect systems:
Middleware is a more generalized layer that handles communication between systems
Custom APIs are specifically designed for a particular integration or business need
Custom APIs are more flexible, lightweight, and tailored to your workflow—making them ideal for companies that need precise, scalable connections.
Why Work with a Custom Software Development Company in New York?
Building effective APIs requires a deep understanding of both legacy systems and modern tech stacks. Here’s why businesses trust custom software development companies in New York:
Local knowledge of regulatory requirements, compliance, and market needs
Time zone alignment for seamless collaboration
Experience with complex integrations across industries like finance, healthcare, logistics, and retail
Access to top-tier engineering talent for scalable, secure API architecture
Partnering with the best software development company in New York ensures your integration project is delivered on time, within budget, and aligned with your long-term business goals.
The Role of API Management Platforms
For businesses dealing with multiple APIs, platforms like MuleSoft, Postman, or Azure API Management provide centralized control. A top software development company in New York can help you:
Monitor performance and uptime
Control access and permissions
Analyze usage metrics
Create scalable microservices environments
Conclusion: Modernization Without Disruption
Legacy systems don’t have to hold you back. With the right strategy, they can be revitalized and connected to the tools and platforms that power modern business. Custom APIs act as bridges, allowing you to preserve what works while embracing what’s next.
For businesses in New York looking to modernize without massive overhauls, partnering with a trusted custom software development company in New York like Codezix is the smartest move. From planning and development to deployment and support, Codezix builds the API solutions that connect your old systems with tomorrow’s innovation.
Looking to integrate your legacy systems with modern platforms? Let Codezix, a top software development company in New York, create powerful custom APIs that enable secure, scalable transformation—without starting from scratch.
Know more: https://winklix.wixsite.com/winklix/single-post/custom-software-vs-saas-which-is-more-secure
Best MetaTrader API Bridge Solutions for Copy Trading
Copy trading has revolutionized the financial markets, making it possible for traders to mimic the trades of experienced professionals. One of the most effective ways to perform copy trading is through a MetaTrader API Bridge, which connects trading platforms with external systems for smooth trade execution. In this article, we explore the best MetaTrader API Bridge solutions for copy trading, their benefits, and how they enhance trading efficiency.

A MetaTrader API Bridge is software that seamlessly connects MetaTrader 4 or MetaTrader 5 with outside trading systems, liquidity providers, or multiple accounts. This makes it possible to automate trade execution, replicate strategies between accounts, and enhance overall trading performance. An API bridge is essential for copy trading because it ensures real-time trade synchronization, low-latency execution, and efficient trade distribution across multiple accounts.

Advantages of a MetaTrader API Bridge for Copy Trading

1. Instant Execution of Orders: Trades are executed in real time across connected accounts, avoiding delays that could hurt profitability.
2. Multi-Account Trading: The bridge enables management of multiple accounts at the same time, making it suitable for professional traders and portfolio managers.
3. Automated Copy Trading: API bridges enable automated trade replication, eliminating the need for manual intervention.
4. Reduced Latency: Latency is a significant issue in copy trading. High-speed API bridges minimize execution delays, ensuring that copied trades are placed at the best possible prices.
5. Customizable Trade Allocation: Traders can allocate trades according to account size, risk preferences, or predefined strategies.
6. Enhanced Risk Management: Advanced risk-management settings on the MT5 API Bridge can apply protective stop-loss and take-profit measures to copied trades.
7. Easy Integration with Algo Trading: Traders who use algorithmic strategies can integrate their MT5 API Bridge with pre-configured bots that automatically execute optimized trades.

Best MetaTrader API Bridges for Copy Trading

1. Combiz Services API Bridge: One of the most reliable solutions, providing fast trade execution, real-time synchronization, and advanced copy trading features. It suits traders looking for a robust, scalable solution.
2. Trade Copier for MT4 & MT5: Enables instant trade replication between multiple MetaTrader accounts, making it ideal for traders who manage several clients.
3. One-Click Copy Trading API: Provides a friendly user interface and one-click trade copying for both beginner and professional traders.
4. FX Blue Personal Trade Copier: A powerful free tool that copies trades between accounts with very low latency and customizable copy parameters.
5. Social Trading Platforms with API Bridges: Many social trading networks provide built-in API bridge functionality, enabling users to copy trades from expert traders directly to their MetaTrader accounts.

How to Choose the Right MetaTrader API Bridge for Copy Trading

When selecting an API bridge for copy trading, consider the following factors:

- Latency and Execution Speed: Choose a bridge that offers fast trade execution with minimal slippage.
- Multi-Account Compatibility: Ensure the solution supports multiple trading accounts.
- Risk Management Features: A good bridge allows stop-loss, take-profit, and risk-allocation settings.
- Customization and Automation: The right choice should support custom trading strategies and algo-trading integrations.
- Reliability and Support: A bridge must have a good track record and reliable customer support.
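As an illustration of the customizable trade allocation described above, a copy-trading bridge typically scales the master's position size to each follower's equity. The figures and account names below are invented:

```python
# Sketch of proportional trade allocation in a copy-trading bridge: a
# master trade is replicated to follower accounts scaled by equity.
# Figures and account names are invented for illustration.

def replicate(master_trade, master_equity, followers):
    """Scale the master's lot size to each follower's account equity."""
    copies = []
    for name, equity in followers.items():
        lots = round(master_trade["lots"] * equity / master_equity, 2)
        copies.append({"account": name, "symbol": master_trade["symbol"],
                       "side": master_trade["side"], "lots": lots})
    return copies

master = {"symbol": "EURUSD", "side": "BUY", "lots": 1.0}
orders = replicate(master, master_equity=100_000,
                   followers={"follower_a": 50_000, "follower_b": 10_000})
# follower_a gets 0.5 lots, follower_b gets 0.1 lots
```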
beeingapis · 11 months ago
"Hive with Care"
canmom · 4 months ago
using LLMs to control a game character's dialogue seems an obvious use for the technology. and indeed people have tried, for example nVidia made a demo where the player interacts with AI-voiced NPCs:
this looks bad, right? like idk about you but I am not raring to play a game with LLM bots instead of human-scripted characters. they don't seem to have anything interesting to say that a normal NPC wouldn't, and the acting is super wooden.
so, the attempts to do this so far that I've seen have some pretty obvious faults:
relying on external API calls to process the data (expensive!)
presumably relying on generic 'you are xyz' prompt engineering to try to get a model to respond 'in character', resulting in bland, flavourless output
limited connection between game state and model state (you would need to translate the relevant game state into a text prompt)
responding to freeform input, models may not be very good at staying 'in character', with the default 'chatbot' persona emerging unexpectedly. or they might just make uncreative choices in general.
AI voice generation, while it's moved very fast in the last couple years, is still very poor at 'acting', producing very flat, emotionless performances, or uncanny mismatches of tone, inflection, etc.
although the model may generate contextually appropriate dialogue, it is difficult to link that back to the behaviour of characters in game
so how could we do better?
the first one could be solved by running LLMs locally on the user's hardware. that has some obvious drawbacks: running on the user's GPU means the LLM is competing with the game's graphics, meaning both must be more limited. ideally you would spread the LLM processing over multiple frames, but you still are limited by available VRAM, which is contested by the game's texture data and so on, and LLMs are very thirsty for VRAM. still, imo this is way more promising than having to talk to the internet and pay for compute time to get your NPC's dialogue lmao
second one might be improved by using a tool like control vectors to more granularly and consistently shape the tone of the output. I heard about this technique today (thanks @cherrvak)
third one is an interesting challenge - but perhaps a control-vector approach could also be relevant here? if you could figure out how a description of some relevant piece of game state affects the processing of the model, you could then apply that as a control vector when generating output. so the bridge between the game state and the LLM would be a set of weights for control vectors that are applied during generation.
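a toy sketch of what 'applying a control vector during generation' could mean mechanically: precompute a steering direction, then add it to the hidden state before the output projection. the numbers are made up and this isn't a real model, just the shape of the idea:

```python
# toy illustration of control vectors, pure python. a real implementation
# hooks a transformer's residual stream; here the 'model' is one tiny
# linear output head with made-up weights.
hidden_dim, vocab = 4, 3
W_out = [[0.2, -0.1, 0.05],     # stand-in output head: hidden_dim rows,
         [-0.3, 0.4, 0.1],      # vocab columns
         [0.15, 0.0, -0.2],
         [0.05, 0.25, -0.1]]

# the control vector would be precomputed offline, e.g. as the difference
# of mean hidden states between 'in character' and 'generic assistant' runs
control = [3.0, 0.0, 0.0, 0.0]

def next_token_logits(hidden, strength=0.0):
    # game state -> strength: the game decides how hard to steer per NPC
    steered = [h + strength * c for h, c in zip(hidden, control)]
    return [sum(s * W_out[i][j] for i, s in enumerate(steered))
            for j in range(vocab)]

h = [0.5, -0.2, 0.1, 0.3]
plain = next_token_logits(h)
steered = next_token_logits(h, strength=1.0)
# steering shifts the logits (and thus the sampling distribution)
# without retraining or re-prompting the model
```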
this one is probably something where finetuning the model, and using control vectors to maintain a consistent 'pressure' to act a certain way even as the context window gets longer, could help a lot.
probably the vocal performance problem will improve in the next generation of voice generators, I'm certainly not solving it. a purely text-based game would avoid the problem entirely of course.
this one is tricky. perhaps the model could be taught to generate a description of a plan or intention, but linking that back to commands to perform by traditional agentic game 'AI' is not trivial. ideally, if there are various high-level commands that a game character might want to perform (like 'navigate to a specific location' or 'target an enemy') that are usually selected using some other kind of algorithm like weighted utilities, you could train the model to generate tokens that correspond to those actions and then feed them back in to the 'bot' side? I'm sure people have tried this kind of thing in robotics. you could just have the LLM stuff go 'one way', and rely on traditional game AI for everything besides dialogue, but it would be interesting to complete that feedback loop.
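one cheap way to close that loop, sketched here with invented token names: reserve special action tokens in the model's output format and parse them out before handing the dialogue to the player:

```python
# sketch of closing the loop between LLM output and game 'AI': reserve
# special tokens for high-level commands and split them out of the
# generated text. the token names and commands are invented.

ACTIONS = {
    "<goto>": lambda arg: f"navigate to {arg}",
    "<target>": lambda arg: f"set target: {arg}",
}

def dispatch(generated: str):
    """Split model output into spoken dialogue and game commands."""
    dialogue, commands = [], []
    for line in generated.splitlines():
        token, _, arg = line.partition(" ")
        if token in ACTIONS:
            commands.append(ACTIONS[token](arg))
        else:
            dialogue.append(line)
    return dialogue, commands

out = "Follow me, I know a shortcut.\n<goto> old mill\n<target> bandit leader"
dialogue, commands = dispatch(out)
# dialogue goes to the voice/text system, commands to the agentic AI
```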
I doubt I'll be using this anytime soon (models are just too demanding to run on anything but a high-end PC, which is too niche, and I'll need to spend time playing with these models to determine if these ideas are even feasible), but maybe something to come back to in the future. first step is to figure out how to drive the control-vector thing locally.
watchnrant · 7 months ago
Interior Chinatown: A Sharp Satire That Challenges Stereotypes and Forces Self-Reflection
Interior Chinatown is a brilliant yet understated reflection of the world—a mirror that exposes how society often judges people by their covers. The show captures this poignantly with the scene where Willis Wu can’t get into the police precinct until he proves his worth by delivering food. It’s a powerful metaphor: sometimes, if you don’t fit the mold, you have to prove your value in the most degrading or unexpected ways just to get a foot in the door. The locked precinct doors represent barriers faced by those who don’t match the “majority’s” idea of what’s acceptable or valuable.
While the series centers on the Asian and Pacific Islander (API) community and the stereotypical roles Hollywood has long relegated them to—background extras, kung fu fighters—it forces viewers to confront bigger questions. It makes you ask: Am I complicit in perpetuating these stereotypes? Am I limiting others—or even myself—by what I assume is their worth? It’s not just about API representation; it’s about how society as a whole undervalues anyone who doesn’t fit neatly into its preferred narrative.
The show can feel confusing if you don’t grasp its satirical lens upfront. But for me, knowing the context of Charles Yu’s original book helped it click. The production team does an incredible job balancing satire with sincerity, blurring the line between real life and the exaggerated Hollywood “procedural” format. They cleverly use contrasting visuals and distinct camera work to draw you into different headspaces—Hollywood’s glossy expectations versus the grittier reality of life.
Chloe Bennet’s involvement (real name Chloe Wang) ties into the show’s themes on a deeply personal level. She famously changed her last name to navigate Hollywood, caught in the impossible middle ground of not being “Asian enough” or “white enough” for casting directors. It’s a decision that sparks debate—was it an act of survival, assimilation, or betrayal? But for Bennet, it was about carving a space for herself to pursue her dreams.
This theme echoes in one of the show’s most poignant scenes, where Lana is told, “You will never completely understand. You’re mixed.” It’s a crushing acknowledgment of the barriers that persist, even when you’re trying to bridge divides. Lana’s story highlights how identity can be both a strength and an obstacle, and the line serves as a painful reminder of the walls society creates—externally and internally.
Interior Chinatown doesn’t just ask us to look at the system; it forces us to examine ourselves. Whether it’s Willis Wu at the precinct door or Lana trying to connect in a world that sees her as neither this nor that, the show unflinchingly portrays the struggle to belong. And as viewers, it challenges us to question our role in those struggles: Are we helping to dismantle the barriers, or are we quietly reinforcing them?
mariacallous · 3 months ago
Elon Musk’s so-called Department of Government Efficiency (DOGE) has plans to stage a “hackathon” next week in Washington, DC. The goal is to create a single “mega API”—a bridge that lets software systems talk to one another—for accessing IRS data, sources tell WIRED. The agency is expected to partner with a third-party vendor to manage certain aspects of the data project. Palantir, a software company cofounded by billionaire and Musk associate Peter Thiel, has been brought up consistently by DOGE representatives as a possible candidate, sources tell WIRED.
Two top DOGE operatives at the IRS, Sam Corcos and Gavin Kliger, are helping to orchestrate the hackathon, sources tell WIRED. Corcos is a health-tech CEO with ties to Musk’s SpaceX. Kliger attended UC Berkeley until 2020 and worked at the AI company Databricks before joining DOGE as a special adviser to the director at the Office of Personnel Management (OPM). Corcos is also a special adviser to Treasury Secretary Scott Bessent.
Since joining Musk’s DOGE, Corcos has told IRS workers that he wants to pause all engineering work and cancel current attempts to modernize the agency’s systems, according to sources with direct knowledge who spoke with WIRED. He has also spoken about some aspects of these cuts publicly: "We've so far stopped work and cut about $1.5 billion from the modernization budget. Mostly projects that were going to continue to put us down the death spiral of complexity in our code base," Corcos told Laura Ingraham on Fox News in March.
Corcos has discussed plans for DOGE to build “one new API to rule them all,” making IRS data more easily accessible for cloud platforms, sources say. APIs, or application programming interfaces, enable different applications to exchange data, and could be used to move IRS data into the cloud. The cloud platform could become the “read center of all IRS systems,” a source with direct knowledge tells WIRED, meaning anyone with access could view and possibly manipulate all IRS data in one place.
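As an illustrative aside (every name and field below is invented; this has no relation to any actual government system), the basic idea of "letting software systems talk to one another" is just one program exposing records at known paths and returning structured data such as JSON. A minimal sketch of such a read-only handler:

```python
import json

# Hypothetical illustration only: a read-only API handler that lets one
# system expose records to another as JSON. Record fields are invented.
RECORDS = {
    "42": {"record_id": "42", "filing_year": 2023, "status": "processed"},
}

def handle_get(path: str) -> tuple[int, str]:
    """Return (HTTP status code, JSON body) for a GET /records/<id> request."""
    prefix = "/records/"
    if not path.startswith(prefix):
        return 404, json.dumps({"error": "not found"})
    record = RECORDS.get(path[len(prefix):])
    if record is None:
        return 404, json.dumps({"error": "not found"})
    return 200, json.dumps(record)
```

Any client that can issue an HTTP GET and parse JSON can then consume the data, which is exactly why centralizing access behind one such interface is so consequential.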
Over the last few weeks, DOGE has requested the names of the IRS’s best engineers from agency staffers. Next week, DOGE and IRS leadership are expected to host dozens of engineers in DC so they can begin “ripping up the old systems” and building the API, an IRS engineering source tells WIRED. The goal is to have this task completed within 30 days. Sources say there have been multiple discussions about involving third-party cloud and software providers like Palantir in the implementation.
Corcos and DOGE indicated to IRS employees that they intended to first apply the API to the agency’s mainframes and then move on to every other internal system. Initiating a plan like this would likely touch all data within the IRS, including taxpayer names, addresses, social security numbers, as well as tax return and employment data. Currently, the IRS runs on dozens of disparate systems housed in on-premises data centers and in the cloud that are purposefully compartmentalized. Accessing these systems requires special permissions and workers are typically only granted access on a need-to-know basis.
A “mega API” could potentially allow someone with access to export all IRS data to the systems of their choosing, including private entities. If that person also had access to other interoperable datasets at separate government agencies, they could compare them against IRS data for their own purposes.
“Schematizing this data and understanding it would take years,” an IRS source tells WIRED. “Just even thinking through the data would take a long time, because these people have no experience, not only in government, but in the IRS or with taxes or anything else.” (“There is a lot of stuff that I don't know that I am learning now,” Corcos tells Ingraham in the Fox interview. “I know a lot about software systems, that's why I was brought in.")
These systems have all gone through a tedious approval process to ensure the security of taxpayer data. Whatever may replace them would likely still need to be properly vetted, sources tell WIRED.
"It's basically an open door controlled by Musk for all American's most sensitive information with none of the rules that normally secure that data," an IRS worker alleges to WIRED.
The data consolidation effort aligns with President Donald Trump’s executive order from March 20, which directed agencies to eliminate information silos. While the order was purportedly aimed at fighting fraud and waste, it also could threaten privacy by consolidating personal data housed on different systems into a central repository, WIRED previously reported.
In a statement provided to WIRED on Saturday, a Treasury spokesperson said the department “is pleased to have gathered a team of long-time IRS engineers who have been identified as the most talented technical personnel. Through this coalition, they will streamline IRS systems to create the most efficient service for the American taxpayer. This week the team will be participating in the IRS Roadmapping Kickoff, a seminar of various strategy sessions, as they work diligently to create efficient systems. This new leadership and direction will maximize their capabilities and serve as the tech-enabled force multiplier that the IRS has needed for decades.”
Palantir, Sam Corcos, and Gavin Kliger did not immediately respond to requests for comment.
In February, a memo was drafted to provide Kliger with access to personal taxpayer data at the IRS, The Washington Post reported. Kliger was ultimately provided read-only access to anonymized tax data, similar to what academics use for research. Weeks later, Corcos arrived, demanding detailed taxpayer and vendor information as a means of combating fraud, according to the Post.
“The IRS has some pretty legacy infrastructure. It's actually very similar to what banks have been using. It's old mainframes running COBOL and Assembly and the challenge has been, how do we migrate that to a modern system?” Corcos told Ingraham in the same Fox News interview. Corcos said he plans to continue his work at IRS for a total of six months.
DOGE has already slashed and burned modernization projects at other agencies, replacing them with smaller teams and tighter timelines. At the Social Security Administration, DOGE representatives are planning to move all of the agency’s data off of legacy programming languages like COBOL and into something like Java, WIRED reported last week.
Last Friday, DOGE suddenly placed around 50 IRS technologists on administrative leave. On Thursday, even more technologists were cut, including the director of cybersecurity architecture and implementation, deputy chief information security officer, and acting director of security risk management. IRS’s chief technology officer, Kaschit Pandya, is one of the few technology officials left at the agency, sources say.
DOGE originally expected the API project to take a year, multiple IRS sources say, but that timeline has shortened dramatically down to a few weeks. “That is not only not technically possible, that's also not a reasonable idea, that will cripple the IRS,” an IRS employee source tells WIRED. “It will also potentially endanger filing season next year, because obviously all these other systems they’re pulling people away from are important.”
(Corcos also made it clear to IRS employees that he wanted to kill the agency’s Direct File program, the IRS’s recently released free tax-filing service.)
DOGE’s focus on obtaining and moving sensitive IRS data to a central viewing platform has spooked privacy and civil liberties experts.
“It’s hard to imagine more sensitive data than the financial information the IRS holds,” Evan Greer, director of Fight for the Future, a digital civil rights organization, tells WIRED.
Palantir received the highest FedRAMP approval this past December for its entire product suite, including Palantir Federal Cloud Service (PFCS) which provides a cloud environment for federal agencies to implement the company’s software platforms, like Gotham and Foundry. FedRAMP stands for Federal Risk and Authorization Management Program and assesses cloud products for security risks before governmental use.
“We love disruption and whatever is good for America will be good for Americans and very good for Palantir,” Palantir CEO Alex Karp said in a February earnings call. “Disruption at the end of the day exposes things that aren't working. There will be ups and downs. This is a revolution, some people are going to get their heads cut off.”
15 notes · View notes
edwardos · 19 days ago
Text
For a spooky true story post, here's something that's been happening with me and Minecraft for the last >3 years.
In 2021 I suddenly had the inspiration to try hosting a tiny unmodded Minecraft Beta 1.7.3 server for me and online friends, which developed into something great and turned out to be a really prosperous move, even though from the very beginning it's only ever been played by me and one other person. But it's also where I started to discover that I have problems with memory loss around my Minecraft sessions.
This isn't the kind of thing that I'd call supernatural or something like that, but it does creep me out, because these occurrences badly go against what I feel should be normal.
And though somebody could just make all this up, I can't make it up because it all actually happened, and I'm not going to lie here about what I think I do and don't remember, either. Everything that I say happened here, to the best of my knowledge, actually happened in Minecraft in conjunction with real life.
Tumblr media
Starting with the original Monty Home - the name of this server at the time - in 2021, one day I started making tunnels in the ceiling of The Nether. They were narrow, cheap tunnels, and they had a practically-camouflaged stair entrance. So I didn't stumble upon them again for a long time.
I was attempting to find new interesting places in Beta terrain generation by making new portals Nether-side, yada yada yada, you know the drill.
I did come to a sense of normality with this one in the end, remembering that I had an old venture to bridge distant overworld chunks, but for a while, I was really stumped when I accidentally rediscovered these tunnels one day. I really did not remember that these strange tunnels were built by me. That day in which I found the tunnels again was in 2022. And here are the screenshots I took while I was doing that.
Tumblr media Tumblr media Tumblr media
Records in my communication channels say that this scared me and it was a real "What the fuck?" moment. And that I killed a Zombie Pigman while I was in there, which I guess now I realize I didn't have to do because I could have just tunneled around him...
The picture further up in the grassland area is what I believe to be one of the tunnel destinations, taken before I forgot the tunnels that I carved existed. At least the memory came back into focus after a while, though.
Not all of my memories of developing on places I altered in Minecraft have come back.
While checking old places in my records to get information to write this, I was reminded of the time a diamond pickaxe I made in Monty Home went missing. Same Beta 1.7 world as above. I told my friend it went missing and that I had definitely made one, and he was stumped. I always privately suspected that he was screwing with me, but not anymore. Now I think my own memory erased the pickaxe's location. He told me he found this weird and creepy. He wondered if maybe it was a server glitch, but I bet it wasn't. I still don't know what happened to my diamond pickaxe. Allegedly, nobody remembers where it went.
After Monty Home, the server was modded and became Monty Cabin. There was also an era of Monty Cabin where I played Better Than Adventure with my friend, which was fun, but has no significant events in today's recollection that are supposed to be covered here.
Skipping ahead to 2023, I've somehow figured out how to mod Minecraft Beta 1.8 with the ModLoader API like a champ, in spite of having very little Java experience and a bunch of other odds stacked against me. The mod goes online for a good long period and, after some much-needed crash fixes, I create the world named Mt Creep. I also have a bad thyroid at this point and being in my brain is a nightmare.
So, my Minecraft mod is well under development in a version called Pre-Alpha 023, and it's September of 2023. I walk into a tunnel in the Beta 1.8 Nether that I do remember paving to, but I find an offshoot tunnel to it that I don't recognize and post these:
Tumblr media Tumblr media
Fortunately I remembered what this location was and why I built it in the end, probably because it is now an integral part of a Mt Creep project I was chasing, the swamp base that is now connected to Mt Creep (the town) by minecart rail. People who have known my microblog for a while may have recently seen the video I posted of that railway. I visited the swamp again by going through that same portal in November that year. These days the swamp base is very well long-since built, and I hardly visit it now. Though at some point I'm supposed to add more vines and two kinds of moss stone.
Finally, we have the weirdest one I know about of them all. The flight of stairs in that same mod, in an older NBODE world, this year.
Tumblr media
This staircase seems like something I would do, but I don't remember building it. I don't remember the tunneling or the crafting of stairs or even the cave below it, even though I went up and down these stairs and checked out the cave. I'm particularly curious to know why I built it. I can't figure out even now what the actual purpose I had in mind was. Later on I visited the stairs again while the server was being used by two players.
Tumblr media
I believe my friend when he tells me that it's not there because of him; he thought this staircase looked like too much work to be his, and that his staircase wouldn't be this tidy.
But I don't remember building it. It's a useful staircase, I like that it's there, and I'm not annoyed with it. But for all my ability to remember thousands of things, things in Minecraft new and old, things in real life that happened a long time ago, even if I never want to remember them again, hundreds of different traumas, things that fuel intrusive memory anger... I don't remember going here and spending several minutes carving virtual rock and building digital stairs.
9 notes · View notes
lucenare · 1 year ago
Note
is there a public mod list for terrimortis? purely curious whats all in there
I started to answer this on mobile and then it was bothering me so its laptop time. theres a couple notes for things but i can answer any questions about it. this is roughly it, excluding any libs and apis
Main Mods:
Amendments
Armourer's Workshop - lets us build cool cosmetics (Ezra's wheelchair, all the antennae, Leopold's legs, etc)
Beautify Refabricated
Chipped
Clumps
Collective
Incendium
Indium
Structory
Supplementaries
Another Furniture
Better Furniture
Cluttered
Comforts
Convenient Name Tags
Cosmetic Name Tags - How we change our names!
Croptopia
Joy of Painting - pai n t in g mo d
Dark Paintings
Ferritecore
Lithium
Mighty Mail - mailboxes!!
Origins
World Edit
Spark
Trinkets
Stellarity
Twigs
Villager Names - note, makes it so wandering traders dont despawn
Building Wands
Macaws:
Bridges
Doors
Fences
Trapdoors
Lights and Lamps
Paths and Pavings
Windows
Clientside:
Build Guide
Entity Model Features - Lets us use custom models like with optifine
Entity Model Textures
CIT Resewn - allows us to use optifine packs
Continuity - connected textures
Iris Shaders
Jade - lil pop up when you hover over blocks
Jade Addons
JEI
Skin Layers 3D
Sodium
Xaero's Minimap
Xaero's World Map
Zoomify
Chiseled bookshelf visualizer
Cull Leaves
Custom Stars
JER
LambDynamicLights
More Mcmeta
More Mcmeta Emissive Texures
Sodium Extras
Bobby
40 notes · View notes
p-receh · 4 months ago
Text
EPISODE 1
preview arc
Spoiler alert as always.
.
.
.
.
.
A significant contrast in ambiance I noticed when watching the Gentar arc was that the TV version used a lighter tone than the comic counterpart. And ... did the area feel bigger and brighter here?
Tumblr media
I like that this version started in space rather than on earth. If they wanted to convince the audience the season 2 events happened within two months, they really needed to show how tight it was.
Unlike Sori and Windara's arc, this time I have to read the comic AND watch the show back to back.
This show has only 4 episodes, but I felt like I was missing something in the first ep.
For instance...
How the heck did the final goal go from utilizing Daun's 3rd tier, "Rimba/Jungle", to Api's 3rd tier, "Nova"?
Tumblr media
Even the comic didn't bring up the name in Baraju's early arc. It was a suggestion from Hang Kasa, not an order from the other person (in this case: the Commander).
And the show didn't explain the reasoning behind the Commander's briefing about this, yet expected us to easily grasp it??
Tumblr media
The show's barely started and I'm like:
Tumblr media
"Am I missing something from Windara???"
They may have wanted to cut the long expositions of who those characters were and shorten things straight to the main issue, mixing in some lines from Baraju's arc so they could quickly jump into action for the future TV show. That's also why the gang was in this cut: to bridge the story between Gentar and Baraju.
Except, Papa Zola resigned from the main cast (along with Adu Du and Probe). This arc is officially his last appearance in the Boboiboy series. He will be focused more on his adventure with his family in the upcoming movie later this year.
Tumblr media
Also, I think it's funny, but when Gopal told Oboi to use Fusion elementals other than Frostfire because of how frequently he used it, I laughed at the irony that Supra was in the mix yet he didn't even bring it up. The comic version made more sense in context, since Frostfire was the only fusion he ever used before Gentar, by the way😅
Anywho, the TV ver started the story where the gang had to split up due to an unexpected distress signal during their journey to the planet Volcania. After that, it thematically followed the same structure as the comic ver, issue 15.
I said this in my preview post, we got to see Gentar's introduction right from the get go, especially in the opening vid. Aside from the main gang, the elementals are replaced with Gentar(but still got some tiny bits of Gempa & Hali for the sake of fusion part).
Oh, not to mention this is the first transformation where Oboi's front bangs cut sideways. Which IMO, that cut was cool as heck! Man, I love his suit on the tv ver! So good ... 😘
And I love how exaggerated his entrance was. The spotlight, the speeches, the ambiance, every bit of the surroundings supported him as if he were the self-proclaimed celebrity of the block. And I love how he purposely dragged his power-ups out longer than the comic ver. It. Was. That. Necessary.
Tumblr media
(If you guys telling me that this is not Boboiboy and totally a new character, I will 100% believe in you)
Other props to the tv ver go to his weapon. They want to show how big and important it is, not just have it randomly pop out. Thus, enter his magnetic powers to assemble all the scraps into his iconic hammer.
To match the whole "lively" vibe of the theme, the animation took heavy notes from Japanese anime style and ingeniously incorporated it into their 3D renders.
Tumblr media
At first I thought they would go spiderverse-esque but using this method is also perfect for this arc. Kinda reminds me of Snoopy's art style actually.
Tumblr media
(and also ashamed that Sonic Prime didn't do this method even though their flat animations had some potential wtf!!!! Yes I'm still salty over the 2nd and their final season, animation aside)
Ehem.
Okay. Continue to the main antagonists themselves: the Probe x Jo robot hybrid and Adu Du introduction.
I appreciate that the animators still manage to make it as creepy as ... I'd say the FNAF / Poppy Playtime horror standard. Although it's kinda toned down a bit by the lighter background setting.
Tumblr media
(now, will my country censor the horror part? Gonna be a nuisance I'd tell ya🙂‍↕️)
Whooo man, when Adu Du entered the frame, my eyes instantly widened. I still remembered my reaction to how much his character development changed in the comic. But God, I still wasn't prepared for the tv ver! Especially in this shot!
Tumblr media
Like, sir who are you and why you are so cool?!?!! 😳😳
Aaaaaand that's all for now, onto the 2nd episode!
15 notes · View notes
jcmarchi · 22 days ago
Text
Mauricio Vergara, CEO & Co-Founder of Kapwork – Interview Series
New Post has been published on https://thedigitalinsider.com/mauricio-vergara-ceo-co-founder-of-kapwork-interview-series/
Mauricio Vergara, CEO & Co-Founder of Kapwork – Interview Series
Tumblr media Tumblr media
Mauricio Vergara, CEO and Co-Founder of Kapwork, oversees the company’s operations, sales, and marketing. As a former small business owner, he experienced firsthand the strain that late payments place on growing companies. Later, during his time at Google and Unity, he saw how delayed payments negatively affected creators, stalling their ability to scale. Motivated to find a better solution, he began exploring the space—quickly realizing how limited the funding options are for small businesses and how challenging it is for funders to support them effectively.
This led to the creation of Kapwork, a platform that simplifies revenue-based financing by connecting businesses with capital providers through a transparent and frictionless experience. Designed to support the next generation of entrepreneurs, Kapwork helps businesses grow on their terms by turning future revenue into immediate funding.
What inspired you to start Kapwork, and how did your personal experience as a small business owner in Colombia shape your vision for transforming the factoring industry?
I used to run a restaurant and a catering business in Colombia. To this day, I remember when the first customer walked through the door and thinking to myself, “she is here because I have something to offer. Something that adds value.” It was such a surreal experience that I later validated at Google.
Working directly with app and game developers, I realized how much I loved helping people who can create something where nothing was before. Developers, like SMB owners, are the craftsmen and women of the modern age, and for me, there is nothing more satisfying than helping those who create value, generate wealth. That is why I started Kapwork with Pete Thomas. There was a whole sector, critical to our economy, that had been largely ignored by technology. AI has unlocked so much opportunity for transformation, why leave any frontier behind? It’s such a dense problem, and I was drawn to finally try my hand at building a company from the ground up. As Peter Thiel puts it in Zero to One: “A startup is the largest endeavor over which you can have definite mastery”.
You’ve described invoice factoring as a lifeline for small businesses. What does that lifeline look like today, and how is Kapwork reshaping it?
Cashflow is the lifeline for small businesses and without Factoring providing it for them, many of these companies would simply go out of business. In B2B it’s customary to see stretched out payment terms of 30 to 90 days outstanding. Imagine being at the mercy of large corporations waiting to get paid for services and products you’ve already delivered. Without Factoring providing the necessary working capital, these businesses can’t grow and in some cases, can even go out of business.  Factoring has been around for more than 100 years to bridge these gaps, purchasing invoices and delivering capital–but it hasn’t scaled yet. It’s a highly manual, risky and time-intensive process for thousands of factors around the globe.
Kapwork was started to overhaul these operations; we’re putting AI and automation to work to accelerate workflows, reduce errors and scale more effectively. This means more money flowing to businesses. Factoring helps businesses thrive, and Kapwork helps those Factors and the small businesses who rely on them.
Kapwork’s platform includes a self-healing AI agent. Can you explain what that means in practical terms, and how it enhances the factoring process?
Kapwork functions by deploying agents across a huge number of vendor portals to pull and populate data. Traditionally, these types of automations are hard to build and really expensive to keep online, when something changes within the web portal, for example. In that case, the entire process creates more headaches than benefits, and throws reliability into question. We had to develop an approach to prevent this.
When we say that our AI agents are “self-healing,” we mean that when an existing Kapwork AI agent experiences a fatal error due to some new, external change that prevents it from achieving its goal, that same agent can invoke an AI process to evaluate what changed and determine how it needs to be modified or replaced to continue working. This capability is what gives Kapwork our durability: we always retrieve the needed data. When our current approach breaks because a website changed, we don't go back and come up with a new approach ourselves; instead, we let AI do that for us automatically.
What were the biggest technical hurdles in building an AI platform that integrates with 4,000+ vendor portals and financial systems?
The first big technical hurdle is that most Vendor Management Systems (VMS) do not provide APIs, so Kapwork AI Agents had to be designed to navigate the identical VMS interfaces that humans use. These interfaces can be unreliable when it comes to automating data retrieval, with some changing form and function every few months, so we really had to develop a robust error correction system in our agentic framework so Kapwork Agents remain durable despite the inherently unreliable environment and can seamlessly address errors as they notice them.
The second big technical hurdle is the fact that every VMS is different. Although debtors that use Ariba and Coupa generally offer the same user interface for data retrieval, thousands of other debtors in “the long tail” follow no interface standard and present the data funders need in all sorts of non-intuitive, cumbersome ways. To stay efficient, we had to develop an agentic AI system that can explore any portal it hasn’t seen before and quickly figure out where to get the required data and how to write a reliable program to get it.
Finally, the lack of APIs made determining responsible password management protocols to facilitate automation an incredibly challenging hurdle. Non-traditional finance is, in general, guilty of bad password hygiene. We often see multiple parties sharing various account credentials to prove to each other that parties and counterparties are storing the right data in the right systems. So, when it comes to helping this industry automate data verification at scale, defining compliant security protocols and promoting best practices took a lot of research and discussions with operators working in the space today.
How does Kapwork use AI to verify invoices in seconds—something that used to take 1–2 days of manual effort?
Because Kapwork AI Agents work concurrently, for example, retrieving data from 20 portals simultaneously, we can verify invoice data at scale. The data can also then be populated automatically into a centralized dashboard for a comprehensive view of the pipeline.  This is in contrast to most financial teams in the business of verification today, wherein one person can only log into one portal at a time, find the data they need, download it, log out, log back in for their next client, etc., and proceed to move onto the next VMS when they are finished with the first one. Until today, people were doing all this work by hand, in serial fashion, making their way through a large book of confirmations that can take a single person days to complete.
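As a loose sketch of the concurrency idea described above (portal names and the fetch step are stand-ins, not Kapwork's real code): running all portal sessions at once means total wall time is roughly that of the slowest single portal, not the sum of all of them, which is the difference between seconds and days.

```python
import asyncio

# Illustrative only: fetch verification data from many portals concurrently
# instead of one login at a time. fetch_portal is a stand-in for a real
# portal session (login, navigate, download, log out).

async def fetch_portal(name: str) -> dict:
    await asyncio.sleep(0.01)  # stand-in for portal/network latency
    return {"portal": name, "invoices_verified": True}

async def verify_all(portals: list[str]) -> list[dict]:
    # All sessions run concurrently; results come back in input order.
    return await asyncio.gather(*(fetch_portal(p) for p in portals))

# e.g. 20 portals verified in roughly the time of one:
results = asyncio.run(verify_all([f"portal-{i}" for i in range(20)]))
```

The results can then feed a centralized dashboard, as the interview describes.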
What kind of data validation or fraud detection capabilities does your AI system offer, and how do they compare with traditional approaches?
Regarding fraud detection today, Kapwork describes our unique capability as “anomaly detection.” We are not currently applying any specialized AI to this problem but instead leaning on the fact that the data aggregation by Kapwork AI Agents naturally builds up patterns of how two companies do business together, what the value ranges are for things like amount and due date, and whether there is always a purchase order associated with a receivable. As patterns establish over time, Kapwork can detect possible fraud by showing that a recent transaction or set of transactions falls outside the range of what is considered “normal” business, and alert the customer. Much of this could be missed by the human eye and normal processes. It’s an area of exploration and we’re excited to do more here in the future.
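The range-based check described above could look roughly like this; the field, the history shape, and the 20% tolerance are all invented for the example, not Kapwork's actual thresholds.

```python
# Illustrative anomaly check: flag a transaction whose amount falls outside
# the historical range seen for a given buyer/seller pair.

def anomaly_flags(history: list[float], amount: float,
                  tolerance: float = 0.2) -> list[str]:
    """Return a list of flags; empty means the amount looks 'normal'."""
    lo, hi = min(history), max(history)
    flags = []
    if amount < lo * (1 - tolerance):
        flags.append("amount below historical range")
    if amount > hi * (1 + tolerance):
        flags.append("amount above historical range")
    return flags

past_amounts = [9800.0, 10250.0, 11100.0]  # prior invoices for this pair
in_range = anomaly_flags(past_amounts, 10400.0)   # within range → no flags
suspicious = anomaly_flags(past_amounts, 45000.0)  # far above range → flagged
```

The same pattern generalizes to due dates or missing purchase orders: accumulate what "normal" looks like per relationship, then alert on departures from it.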
What role does AI play in improving deal flow and conversion rates for invoice buyers on your platform?
A Factor typically needs one to three months to underwrite an invoice seller. During that time, the seller remains desperate for cash while the factor keeps their capital idle at the bank. Kapwork’s AI instantly verifies invoice data, pulling records directly from debtor systems and delivering a vetted “AP snapshot” that allows credit teams to approve or decline the receivables facility in days, not months. The system also allows factors to verify invoices from their existing customer base more frequently without assuming additional headcount, enabling them to deploy capital faster.
You’ve held leadership roles at Google and Unity. What lessons from Big Tech helped you when transitioning into startup life at Kapwork?
In three ways. It helped me realize that I didn’t want to spend more time watching paint dry, shaped me as a leader, and gave me the confidence to realize that I can always try and figure things out.
In the words of Marc Randolph, “everything is solvable if you’re willing to start and figure it out.” When I first worked at Google and Unity, I often felt impostor syndrome. Having come from a more humble background than my often Ivy League colleagues, I used to doubt my worth, but Google taught me that if I was there, it was for a reason. As I started to progress in my career, I gained the confidence to know that, even though I may not know how to do something, all I need to do is start. With time, you can always figure it out.
Google also shaped me as a leader. It taught me that nothing scales faster than getting people behind a clear vision that they believe in. It also helped me understand what is needed to create a healthy environment where everyone can express their opinions without worry about retribution. Nothing helps a company grow faster than a smart team of proactive individuals rallied around a vision who are willing to challenge your thinking and commit.
Finally, Big Tech also helped me realise that I didn’t want to watch the paint dry any longer. There are so many hierarchies and embedded interests that it is always hard to challenge the status quo. The systems in place are designed to reduce risk, and in some cases they lead to employees being more preoccupied with signaling the good work they do than actually doing the work. I didn’t want that in my life anymore, especially when you see the world moving at the speed of light with all the recent developments in AI.
What’s your vision for how AI will further transform the financial services industry—particularly in underserved SMB segments?
Heterogeneity is good for society and bad for the lending industry. I find it fascinating how the diversity amongst small businesses makes it hard for them to find working capital solutions.
What makes SMBs special literally makes it hard for them to grow. Just think about the diversity of SMBs: on one side, you have a family-owned hardware wholesaler, and on the other, a boutique artisan bakery. Both are great in that they contribute uniquely, but the way these businesses are run and operated is completely different. The variety of industries, financials, and business models makes it incredibly hard for lenders to make up their minds about them, and for SMBs, to navigate the landscape. So how do you assess diverse businesses without a one-size-fits-all approach? AI has been a tool to bridge that gap for a while, but it was always cost-prohibitive.
In my opinion, what is most interesting is that it finally makes economic sense to build solutions to tackle these challenges. More affordable AI will make it possible for the financial industry to build solutions for a highly fragmented and heterogeneous SMB industry.
Thank you for the great interview, readers who wish to learn more should visit Kapwork.
slainesthrone · 1 year ago
So trying to recreate the qsmp mod pack (personal use ig, just saw a bunch of mods I've never seen before and went "Oh shit, that's exciting," then just decided "Fuck it, gonna gather as many of the mods they're using as I can")
I Do Not have any knowledge of any of the dungeons that they have, so if you have any insight, let me know.
Here's the full list I have of confirmed mods and possible mods;
server runs on 1.20.1

qsmp 2024 mods:
1. Regions Unexplored
2. Croptopia
3. Biomes o plenty
4. born in chaos
5. exotic birds
6. enmey expansion
7. chocobo mod
8. farmers delight + some other food mods (possibly multiple) with delight in the name
9. Candlelight(?)
10. Handcrafted
11. Alex's mobs
12. Alex's caves (I can confirm because of a TRAP DOOR in the egg bakery. I'm in the trenches)
13. supplementaries
14. Beachparty
15. create
16. journey maps (idk, some map mod)
17. aquaculture 2
18. cluttered
19. chimes
20. fairy lights
21. FramedBlocks
22. Chipped
23. paraglider
24. Another furniture mod
25. waystones
26. connected glass
27. Create deco
28. Candlelight dinner
29. MOA DECOR
30. Tanuki decor
31. Orcz
32. Modern life
33. Bakery
34. Friends&Foes
35. Meadow
36. Abyssal decor
37. Twigs
38. lootr
39. when dungeons arise (to be confirmed)
40. nether's delight
41. rats
42. Additional lanterns
43. Alex's delight
44. Additional Lights
45. AstikorCarts Redux
46. Athena
47. Awesome dungeon net..(work?)
48. BOZOID
49. Apothic Attributes
50. AppleSkin
51. Balm
52. Better Archeology
53. Better ping Display
54. BetterF3
55. Aquaculture Delight
56. Bookshelf
57. Bygone Nether
58. CC: Tweaked
59. Artifacts
60. Camera Mod
61. Cataclysm Mod
62. Catalogue
63. Citadel
64. Cloth config v10 API
65. Clumps
66. Comforts
67. Configured
68. Controlling
69. CorgiLib
70. CoroUtil
71. Corpse
72. CosmeticArmorReworked
73. Create: Encased
74. Create Confectionery
75. Create Slice & Dice
76. Create: Interiors
77. Create: Steam 'n' Rails
78. Create: Structures
79. CreativeCore
80. Creeper Overhaul
81. Cristel Lib
82. Cupboard Utilities
83. Curios API
84. Customizable Player M(???)
85. Delightful
86. Distant Horizons
87. Domestication Innovations
88. Duckling
89. Dynamic Lights
90. Elevator Mod
91. Embeddium
92. Emotecraft
93. Enderman Overhaul
94. EntityCulling
95. Nether's exoticism
96. YUNG's (x) Mods (bridges, better dessert temples, mineshafts are the only ones I can confirm, might be all but idk for sure)
97. Securitycraft
98. Vinery (confirmed because of Tubbo's drinking binge at spawn yesterday)
99. Mr.Crayfish (Furniture confirmed, possibly more)
100. Naturalist
101. Tom's simple storage
If you know of or have noticed mods that haven't been listed, please reply/reblog with them.
Things are numbered for my own archival reasons; since some mods come as multiple separate mods (such as YUNG's), the numbering will not reflect the true number of mods on the server.
I also have not checked the dependency mods that any of these may need, so keep that in mind.
(Please note that there may be spelling/grammar mistakes in the names of these mods!)
atcuality3 · 1 month ago
Simplify Decentralized Payments with a Unified Cash Collection Application
In a world where financial accountability is non-negotiable, Atcuality provides tools that ensure your field collections are as reliable as your core banking or ERP systems. Designed for enterprises that operate across multiple regions or teams, our cash collection application empowers agents to accept, log, and report payments using just their mobile devices. With support for QR-based transactions, offline syncing, and instant reconciliation, it bridges the gap between field activities and central operations. Managers can monitor performance in real-time, automate reporting, and minimize fraud risks with tamper-proof digital records. Industries ranging from insurance to public sector utilities trust Atcuality to improve revenue assurance and accelerate their collection cycles. With API integrations, role-based access, and custom dashboards, our application becomes the single source of truth for your field finance workflows.
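The offline syncing and tamper-proof digital records described above can be sketched as a hash-chained payment log: each record an agent logs offline commits to the hash of the record before it, so reconciliation at sync time can detect any edits made in between. This is a minimal illustration under assumed names — `CollectionRecord`, `OfflineQueue`, and the chain format are hypothetical, not Atcuality's actual schema:

```python
import hashlib
import json
from dataclasses import dataclass, field


@dataclass
class CollectionRecord:
    """One field payment, sealed with a hash that chains to the previous record."""
    agent_id: str
    amount: float
    currency: str
    prev_hash: str
    record_hash: str = field(default="")

    def seal(self) -> "CollectionRecord":
        # Hash a canonical JSON form so any later edit changes the digest.
        payload = json.dumps(
            {"agent": self.agent_id, "amount": self.amount,
             "currency": self.currency, "prev": self.prev_hash},
            sort_keys=True,
        )
        self.record_hash = hashlib.sha256(payload.encode()).hexdigest()
        return self


class OfflineQueue:
    """Buffers sealed records while the agent is offline; the prev_hash
    links make after-the-fact tampering detectable at sync time."""

    def __init__(self):
        self.pending: list[CollectionRecord] = []
        self.tip = "GENESIS"  # hash of the most recently sealed record

    def log_payment(self, agent_id: str, amount: float,
                    currency: str = "INR") -> CollectionRecord:
        rec = CollectionRecord(agent_id, amount, currency, prev_hash=self.tip).seal()
        self.pending.append(rec)
        self.tip = rec.record_hash
        return rec

    def verify_chain(self) -> bool:
        # Recompute every digest; a single edited amount breaks verification.
        prev = "GENESIS"
        for rec in self.pending:
            expected = CollectionRecord(
                rec.agent_id, rec.amount, rec.currency, prev
            ).seal().record_hash
            if rec.prev_hash != prev or rec.record_hash != expected:
                return False
            prev = rec.record_hash
        return True
```

Because each record commits to its predecessor, the central system only needs the chain tip from the last successful sync to confirm that nothing logged in the field was altered or dropped before upload.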