Andersen is a software development company with a full cycle of services. For over 13 years, we have been helping enterprises around the world to transform business by creating effective digital solutions with the use of innovative technologies.
Three IoT Highlights for Advanced Healthcare Services in the 2020s
With the advancement of technology, the healthcare industry is enjoying opportunities it has never had before. These include, but are not limited to, telemedicine, receiving analysis results within minutes on a mobile app, and tracking blood sugar levels and heart activity with ultra-sensitive wearables. The universe of IoT in health services is vast. IoT can be described as a network of items that read information about their surroundings with sensors and transmit this data to servers or electronic devices via the Internet. The advancement of 5G wireless networks and the medical community's growing demand for digitalization and no-touch attendance open up outstanding opportunities for IoT implementation in healthcare.
In the age of individualized treatment, various monitoring devices that provide highly accurate diagnostics are becoming indispensable. Patients are not the only ones who benefit from these trackers. IoT technology helps healthcare centers better administer their facilities and, moreover, proves to be self-financing, which ultimately results in savings. Along with that, IoT nurtures such breathtaking inventions as robotic surgery and smart pills with sensors that are activated inside the human body to control medication intake. Disruptive IoT will not only lead us to the bright future of medicine - it has already become a reality the world is practicing today, and it promises a competitive edge to those who start implementing it. You can see this for yourself by reading the full version of the article with real cases of successful IoT application in healthcare.
Five Simple Steps to Write an Engaging QA Report

QA reporting is the stage of development at which test outputs are communicated to everyone involved in the project. Based on this, corrective actions are taken, and the decision on whether the product quality is acceptable for release is made. QA reports also have a tangible effect on the distribution of the company's financial resources. After considering the received data, executives either increase investment in the project or cut financing. Therefore, an informative and precise report is worth its weight in gold if product development is to succeed.
So what are the indispensable components of a top-level QA report? And what attributes make it valuable to the manager?
First of all, make your report as simple as possible. Avoid complicated vocabulary where possible and don't include too many acronyms. Add tables or diagrams to your document and insert screenshots of any blockers revealed. This will help you systemize your notes, and the CEO will be able to notice the most relevant data at first glance. It is also good practice to use the same report template every time so that your supervisor immediately knows where in the document to look for the information they need.
Secondly, pay attention to data correctness. Review every report before submitting it to avoid grammar mistakes and factual errors. Draw your manager's utmost attention to critical deviations instead of downplaying them to preserve your reputation. The sooner flaws are revealed, the more likely they are to be removed without further consequences.
Thirdly, always keep in mind whom you are addressing your report to. The area of the QA manager’s interest differs from that of the project client, doesn't it? And the development team’s main issues of concern are not the same as those of customer service. In view of this, modify your statement depending on its recipient.
Fourthly, include all essential metrics in the report. They are:
the general data on the tests, namely their types and entry data;
the objectives and scope of testing;
the particulars regarding the devices and hardware used for the testing;
uncovered errors, the possible ways of their elimination, and the actions to be taken;
the conclusion on the product quality and whether it meets the outlined exit criteria.
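As a rough illustration, the essential metrics listed above can be captured in a simple structure so that every report follows the same template. The field names and example values below are hypothetical, not part of any standard:

```python
from dataclasses import dataclass

@dataclass
class QAReport:
    """Minimal container for the essential QA report metrics."""
    test_types: list          # general data on the tests (e.g., "smoke", "regression")
    objectives: str           # objectives and scope of testing
    environment: str          # devices and hardware used for the testing
    defects: list             # uncovered errors with proposed actions
    exit_criteria_met: bool   # conclusion on the product quality

    def summary(self) -> str:
        verdict = "meets" if self.exit_criteria_met else "does not meet"
        return (f"{len(self.test_types)} test types run on {self.environment}; "
                f"{len(self.defects)} defect(s) found; product {verdict} exit criteria.")

report = QAReport(
    test_types=["smoke", "regression"],
    objectives="Verify the login and checkout flows",
    environment="Chrome 120 / Android 14",
    defects=["checkout button unresponsive on Android"],
    exit_criteria_met=False,
)
print(report.summary())
```

Keeping the structure fixed like this also makes it easy to generate the same tables and diagrams from report to report.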
If you are reporting to the company's executives, you may include the metrics that directly influence revenue. These include customer profitability, especially when potential risks affect relationships with customers. You may also mention the overall number and complexity of the tasks to bring the QA department's performance into view.
Lastly, when drawing up your statement, take into consideration the business aspect of each matter. You may flood your statement with numerous figures and undeniable statistical data. However, without some dynamics and a clear call to action, your document will remain abstract to the company’s decision-makers. So don’t hesitate to guide them by giving an assessment of the information provided and proposing particular steps.
We carried out research and found that even the order of data in your document makes a difference. The report writer analyzes the test results and plays a part in determining their strategic value for the entire project. After all, it's the report writer who calls the audience's attention to the urgent issues that endanger the product quality, customers' trust, and the expected profits.
There are more helpful tips that can give a QA report considerable practical importance. Learn more about how to get the most out of your QA processes on the Andersen QA Community webpage.
Five Application Scenarios of AI in Banking
Over the past decades, banks have been improving their ways of interacting with customers, tailoring modern technology to the specific character of their work. For example, the first ATMs appeared in the 1960s, and payment cards followed ten years later. At the beginning of this century, users got round-the-clock online banking, and in 2010, mobile banking arrived. But the development of the financial system didn't stop there, as the digital age is opening up new opportunities - the use of Artificial Intelligence. By 2023, banks are projected to save $447 billion by applying AI apps. We will tell you how financial institutions are making use of this technology in their operations today.
AI-powered chatbots
Chatbots are AI-enabled conversational interfaces. This is one of the most popular cases of applying AI in banking. Bots communicate with thousands of customers on behalf of the bank without requiring large expenses. Researchers have estimated that financial institutions save four minutes for each communication that the chatbot handles.
Since customers use mobile apps to carry out monetary transactions, banks embed chatbot services in them. This makes it possible to attract users’ attention and create a brand that is recognizable in the market.
For example, Bank of America launched a chatbot that sends users notifications, informs them about their balances, makes recommendations for saving money, provides updates to credit reports, and so on. This is the way the bank helps its clients to make informed decisions.
Another example is the launch of the Ceba chatbot, which brought great success to the Australian Commonwealth Bank. With its help, about half a million customers were able to solve more than two hundred banking issues: activate their cards, check account balances, withdraw cash, etc.
Mobile banking
AI functionality in mobile apps is becoming more proactive, personalized, and advanced. For example, Royal Bank of Canada has included Siri in its iOS app. Now, to send money to another card, it’s enough to say something like: "Hey, Siri, send $30 to Lisa!" - and confirm the transaction using Touch ID.
Thanks to AI, banks generate 66% more revenue from mobile banking users than when customers visit branches. Banking organizations are paying close attention to this technology to improve their quality of services and remain competitive in the market.
Data collection and analysis
Banking institutions record millions of business transactions every day. The volume of information generated by banks is enormous, so its collection and registration turn into an overwhelming task for employees. Structuring and recording this data is impossible until there is a plan for its use. Therefore, determining the relationship between the collected data is challenging, especially when a bank has thousands of clients.
There used to be the following approach: a client came to a meeting with a bank employee who knew their name and financial history and understood what options were better to offer. But that's history now. With the wealth of data coming from countless transactions, banks are trying to implement innovative business ideas and risk management solutions.
AI-based apps collect and analyze data, improving the user experience. The information can be used for granting loans or detecting fraud. Companies that have estimated their profit from Big Data analysis report an average revenue increase of 8% and a cost reduction of 10%.
Risk management
Extension of credit is quite a challenging task for bankers. If a bank gives money to insolvent customers, it can get into difficulties. If a borrower loses a stable income, this leads to default. According to statistics, in 2020, credit card delinquencies in the U.S. rose by 1.4% within six months.
AI-powered systems can appraise customer credit histories more accurately to avoid this level of default. Mobile banking apps track financial transactions and analyze user data. This helps banks anticipate the risks associated with issuing loans, such as customer insolvency or the threat of fraud.
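As a hedged sketch of the idea, a risk model weighs factors such as the debt-to-income ratio and payment history into a single score. The weights and thresholds below are purely illustrative and bear no relation to any real bank's scoring model:

```python
def credit_risk_score(monthly_income, monthly_debt, missed_payments, years_of_history):
    """Toy credit-risk score (0-100, higher = riskier); illustrative only."""
    dti = monthly_debt / monthly_income if monthly_income else 1.0  # debt-to-income ratio
    score = 0.0
    score += min(dti, 1.0) * 50              # heavy weight on debt-to-income
    score += min(missed_payments, 5) * 8     # each missed payment adds risk
    score -= min(years_of_history, 10) * 2   # a long history reduces risk
    return max(0.0, min(100.0, score))

# A borrower with low debt and a clean history scores far lower
# than one with high debt and several missed payments.
low_risk = credit_risk_score(5000, 500, 0, 8)
high_risk = credit_risk_score(3000, 2400, 3, 1)
```

A real AI-powered system would learn such weights from historical repayment data rather than hard-coding them, but the output - a risk estimate that informs the lending decision - plays the same role.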
Data security
According to the Federal Trade Commission report for 2020, credit card fraud is the most common type of personal data theft.
AI-based systems are effective against malefactors. The programs analyze customer behavior, location, and financial habits and trigger a security mechanism if they detect any unusual activity. ABI Research estimates that spending on AI and cybersecurity analytics will amount to $96 billion by the end of 2021.
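The mechanism can be sketched in a few lines: compare a new transaction against the customer's spending history and flag sharp deviations. A production system would use far richer features (location, device, merchant type); this toy z-score check only illustrates the principle:

```python
import statistics

def flag_unusual(transactions, new_amount, threshold=3.0):
    """Flag a transaction whose amount deviates sharply from a
    customer's history (simple z-score sketch, not a production model)."""
    mean = statistics.mean(transactions)
    stdev = statistics.pstdev(transactions)
    if stdev == 0:
        return new_amount != mean
    z = abs(new_amount - mean) / stdev
    return z > threshold

history = [42.0, 55.0, 38.0, 60.0, 47.0]  # typical card spending
flag_unusual(history, 50.0)    # ordinary amount -> not flagged
flag_unusual(history, 5000.0)  # sudden spike -> triggers the security mechanism
```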
Amazon has already acquired Harvest.ai - an AI cybersecurity startup - and launched Macie, a service that applies Machine Learning to detect, sort, and structure data in S3 cloud storage.
Conclusion
There are more ways to apply AI in the finance industry. According to an OpenText survey, 80% of banks recognize the benefits of AI, 75% of them already make use of this technology, and 46% plan to implement AI-based systems in the near future.
AI-powered solutions are becoming an integral part of companies' development strategies, helping them remain competitive in the market. This technology minimizes operating costs, improves customer support, and automates processes. Andersen's financial experts will help you implement these and other software products for the digital transformation of your banking, investment, or insurance business.
Key Features of Accounting Software to Improve Business Processes
It takes companies a lot of time and effort to perform their accounting and bookkeeping. Manual journal entries and reporting require great accuracy and continuous attention. Accountants need to be closely familiar with all financial regulations, as these are subject to constant updating. All of the above adds complexity to accounting and makes it obscure to business owners.
Accounting software is designed for automating core accounting processes. It allows companies to record, calculate, and analyze their data with the help of AI-driven algorithms. Accounting systems are good at performing repetitive time-demanding operations, and they benefit business growth. How is that possible?
First of all, accounting software contributes to the protection of a company's most sensitive information: its financial data. Files and documents remain secure thanks to access control and multiple firewalls, while figures and computations are subject to end-to-end encryption. Additionally, regular backups prevent data loss. Such confidentiality enhances clients' trust and guards firms against financial losses.
Next, automated solutions improve data visibility. The figures are automatically posted to sub-ledgers devoted to accounts payable, accounts receivable, and purchases. The final data is summarized in a general ledger, and automated reports are generated. This properly stored and structured information helps business owners form a clear opinion of their financial statements and make well-grounded strategic decisions.
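The posting flow described above - entries going into sub-ledgers and being rolled up into a general ledger - can be sketched roughly as follows (the account names and amounts are illustrative):

```python
from collections import defaultdict

# Entries are posted to sub-ledgers (accounts payable, accounts
# receivable, purchases) and then summarized in a general ledger.
sub_ledgers = defaultdict(list)

def post(ledger_name, description, amount):
    """Record one entry in the named sub-ledger."""
    sub_ledgers[ledger_name].append((description, amount))

def general_ledger():
    """Summarize each sub-ledger into a single balance."""
    return {name: sum(amount for _, amount in entries)
            for name, entries in sub_ledgers.items()}

post("accounts_payable", "office supplies invoice", -250.00)
post("accounts_receivable", "client invoice #1042", 1800.00)
post("purchases", "new laptop", -1200.00)
general_ledger()
# -> {'accounts_payable': -250.0, 'accounts_receivable': 1800.0, 'purchases': -1200.0}
```

Automated reports are then generated from exactly this kind of rolled-up structure, which is what gives business owners the clear, well-structured view of their figures.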
Moreover, accounting apps are easily adaptable to the requirements of each business. Custom accounting software is flexible and can be designed for a particular industry, whether it is healthcare, banking, or manufacturing. Business volume and specificity determine the system’s options and function package. It means that every company can implement accounting systems in its workflow, regardless of its size and area of interest.
These and a whole range of other substantial advantages allow financial analysis software to quickly and smoothly perform the key accounting operations. Among them are data recording, computing, regular reporting, fiscal analysis, financial forecasting, tracking of accounts payable and accounts receivable, inventory management, purchase order management, and many others. This wide range of features has produced the belief that automated systems compete with accountants and will replace them before long.
In spite of all its useful features, accounting software doesn't claim to completely replace accountants. It will always remain an assistant that makes specialists' work more accurate and saves them time so that they can concentrate on complex tasks. Financial analysis and strategic planning, as well as explaining results and consulting, surely require human involvement. It is the combination of manual and automated accounting that contributes to companies' growth, along with the professional advancement of their accountants.
Every business model is unique. An enterprise's customized accounting solution depends on its size and location. The record-keeping of small companies differs from the multi-layer database software of big corporations. But the main criteria for most businesses when choosing accounting software are its price, the user-friendliness of the interface, limitations on the number of users and transactions, and the degree of integration with third-party systems. Also, a lot depends on whether it is a cloud-based solution with non-stop real-time access to documents and transactions or on-premises software. Learn more about accounting software development to make the right choice.
Five Beneficial Applications of Artificial Intelligence in Compliance
The more the world becomes immersed in digital banking - with its e-payments, online services, cryptocurrency, etc. - the more demanding banks' and clients' data protection requirements become. It's no surprise that digitization in financial services, along with its manifold advantages, forms fruitful ground for all types of cyber fraud. Therefore, companies need to comply with GDPR, AML, and KYC regulations not only to avoid state penalties but also to protect their relationships with clients from damage.
Artificial Intelligence helps to find the easiest and fastest exit from a maze of numerous guidelines. AI in banking deals splendidly with recurring processes and monotonous actions, such as classifying tons of seemingly uncoordinated paragraphs and figures, facilitating firms’ and clients’ security with the help of cutting-edge identification techniques, and bringing the companies’ documents into correlation with applicable legislation. And most of these processes run on the back-end without being noticed by the customers while keeping the interaction between those customers and their banks productive.
Although the adoption of AI in financial processes doesn't lead to tangible results in the blink of an eye, this technology quickly becomes integrated into an organization's current systems thanks to continuous Machine Learning and the employment of various algorithms. These algorithms are indispensable for dealing with compliance issues in many ways. Here are the most effective ways to use AI for compliance with the most complex and controversial regulations in the financial sector.
Why Is Software Testing So Important?
What if IT guys are deceiving you? What if testing is an absolutely useless process that doesn’t bring any value and wastes your project budget? Spoiler: no, they aren’t deceiving you! To get more details, let’s get the facts straight. Now, we are going to outline four main benefits you get from testing. Each of them transforms into real profit or savings.
1. You save money
That's right, you pay testers and save. How so? Testing is a very cost-effective thing, you know. Software development is a continuous process made up of many stages building on one another, like levels of a pyramid. Disassembling the pyramid of your application can be more expensive than paying testers right away to diligently check each level.
And what will happen if you sweep this long-neglected mistake back under the rug and release the product with it? Let's see. You've created the awesome app of your dreams, run a massive advertising campaign, and are ready to dive into gold like Scrooge McDuck. But on the day of release, all your dreams are shattered by a ridiculous little mistake: users can't log in. Of course, you slap yourself and correct it. But the users who left, swearing, for your competitor cannot be won back. You've lost your money, reputation, appetite, and God knows what else. And all because of a ridiculous bug that testers would never miss. So, is it a profitable investment? What do you think?
2. Your product is secure
The Internet is a dangerous place, guys. If your product collects user information, one day somebody will try to steal it. If transactions are carried out through your application, one day somebody will try to steal them. If you once sent nudes, one day… Well, you get it. The user chooses the platforms they are ready to trust with their security.
You can read more about application security testing and a couple of obvious advantages of an integrated approach to testing in the main article.
Capital Management 2021: Focus on Customers by Means of Digital Technologies

Specialists in asset management, especially financial advisors and managers, are pondering how to preserve the assets under their management after they are passed down to the next generation. Part of these assets will go toward pensions or cover the expenses of the previous owners, but most will pass into the hands of inheritors. And one should bear in mind that representatives of the new generation approach capital in a totally different way.
The difference in attitudes towards capital between millennials and previous generations
Millennials not only have a different attitude towards money but also expect a different approach from financial advisors and managers. Financial managers used to build more personal relationships with clients and organize regular meetings. Investors of the new generation don't care much about large offices and business lunches in fancy restaurants; they prefer a more practical approach. While the personal relationship between the client and the manager is still important, most transactions now take place online. Companies need to find new solutions for communicating with their customers, thus raising the level of service.
Some conservative managers doubt that digital solutions for interacting with clients will actually be in demand. In practice, it turns out that not only millennials but also representatives of the older generations willingly use various applications to control their finances.
Millennials’ abandonment of financial advisors and managers
What are the expectations of today's generation of clients, and how can asset managers respond to these demands? One of the most important innovations is that asset managers now put the emphasis on self-service. Clients have begun to conduct their own market analysis and choose which company, fund, or exchange to invest in.
There is a huge amount of information in the public domain for study, and this allows private investors to independently analyze assets and make decisions. Now clients better understand the changes in quotes and commissions, as well as see the emergence of new tools for interaction with companies. Online trading is available round the clock - investment information is updated 24/7.
Progress changes regulatory requirements
Along with technology development, not only customer behavior but also the requirements of regulatory authorities are changing. Regulators demand maximum transparency from companies, and inspections are becoming more complex. Compliance without the use of modern tools is becoming increasingly difficult. The latest technologies are now an integral part of the successful work of companies, and those companies that don’t have a high-quality IT infrastructure will find themselves in a difficult situation.
The corresponding software can either be developed or purchased as a ready-made solution. As a rule, the larger the investment company is, the more it needs an original application developed for its processes. This is proven by Andersen’s experience: the majority of our company's clients operating in the financial sector are large banks and corporations. However, we have also provided specialists to help small FinTech startups - they have noticed the benefits of outsourcing as well.
It was only yesterday that communication with clients was kept solely by phone calls; today they have been replaced by text messages via messenger apps and social networks. When it comes to sensitive data, investment companies use more reliable platforms designed specifically for this purpose. It is possible that, in the future, there will be new ways to communicate and support the work of managers with clients. Technology should provide clients with the ability to independently control the situation, but at the same time, the investor shouldn’t completely lose touch with the asset manager.
There is no truth in thinking that now the client is faced with a choice between the "old school" and the self-service mode. These approaches can be successfully combined. The client is often able to make a decision on the deposits on their own, but sometimes they still need the help of a professional consultant. For instance, if a travel agent can provide a client with a favorable discount and offer travel on the most favorable terms, why would a tourist search for and book a hotel independently? If an investment manager rightfully argues that they are able to provide a subsequent high return, why not pay them 1% with the guarantee of a return of 3% of the investment amount?
Benefits of IT Outsourcing for Application Support
Outsourced IT support, or the delegation of application support to a third-party company, is one of the most requested types of IT services, and it is gaining more and more popularity. Entrusting external contractors with app monitoring and maintenance has become more than a matter of cost-effectiveness and better performance. It gives firms access to outstanding resources and knowledge from all over the world. The ordering party instantly receives a helping hand from certified and trained professionals. Outsourcing IT support even allows small businesses to compete with well-recognized brands. How? Simply by choosing the right vendor - one that offers the modern approaches and know-how that top-ranked market players benefit from. Moreover, an experienced provider takes responsibility for the risks that arise and offers you a better understanding of your own processes.
There are multiple forms of IT support outsourcing available today, so companies can choose the cooperation model that suits them best. The development of state-of-the-art technologies enables firms to entrust their incident management to an expert in a different part of the globe. If their processes require the personal presence of software engineers, they can opt for a partner in their own country. They can also combine the offshore and onshore approaches. Businesses might want to transfer their support responsibilities to a third party completely, or just partly. They can offload the application support burden for a long period by regularly paying a subscription fee to the outsourcing vendor, or make use of incident-based assistance, crossing bridges when they get to them. It all depends on the scope and complexity of the tasks being outsourced. Apart from this, teams should keep some other things in mind when choosing the best solution. We strongly recommend taking into consideration the language barrier, different time zones, cultural divides, etc.
Another tangible advantage of outsourcing IT support is flexible pricing. Along with time-honored methods such as paying for a clearly specified amount of services, there are some innovative strategies - for instance, value-based pricing focused on productivity and real results. In this case, the provider's effectiveness is tracked by measurable performance metrics, including but not limited to application downtime, malfunction frequency, and customer feedback. This model minimizes the risk of an inadequate scope or quality of services. When deciding on the best pricing solution, take into account such factors as the specific character of your business, the expertise of your in-house IT management, and the multiplicity of tasks.
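As a rough sketch of such value-based pricing, a vendor's monthly fee could shrink when measured uptime falls below an agreed target. The target, penalty rate, and cap below are invented for illustration only:

```python
def monthly_fee(base_fee, uptime_pct, target_uptime=99.5, penalty_per_point=0.05):
    """Value-based pricing sketch: the vendor's fee is reduced when
    measured uptime misses the agreed target. Figures are illustrative."""
    shortfall = max(0.0, target_uptime - uptime_pct)       # percentage points missed
    discount = min(shortfall * penalty_per_point, 0.5)     # cap the reduction at 50%
    return round(base_fee * (1 - discount), 2)

monthly_fee(10_000, 99.9)  # target met -> full fee of 10000.0
monthly_fee(10_000, 97.5)  # 2-point shortfall -> fee reduced to 9000.0
```

Malfunction frequency or customer-satisfaction scores could be folded into the same formula; the point is that the contract ties payment to metrics both sides can measure.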
With all that said, how do you select an appropriate IT support outsourcing company? What red flags will help you identify potential pitfalls at the negotiation stage, even before you sign the contract? How do you evaluate the provider's performance based on the reports received from your own team and on clients' feedback? And what steps should be taken if you find that the results of your vendor's work are unsatisfactory? Learn all you need to know about IT support outsourcing from the article on our community's page.
The Main Beneficial Aspects of QA Automation for Business
The frequency at which applications are released has increased dramatically in the last few years. Companies are constantly developing and testing hundreds of products. At the same time, customers remain quite picky about a user interface's clarity and usability. Therefore, the main concern of IT firms is how to maintain both the high quality of a product and the rapid pace of its development. Compliance with data security regulations is also important, as is optimization of the company's resources.
QA automation involves the execution of testing processes by AI-driven systems, leading to efficient time management and better productivity for QA engineers. Test automation is good at performing simple, monotonous tasks throughout application development. This reduces human error and results in higher product quality. Additionally, automated QA easily manages authentication and authorization testing, which helps ensure the company's compliance with current data protection directives.
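A minimal sketch of what such automated authentication and authorization checks might look like - the `authorize` function, roles, and actions here are hypothetical stand-ins for a real application's access-control layer:

```python
# Hypothetical access-control layer under test.
ROLES = {"alice": "admin", "bob": "viewer"}

def authorize(user, action):
    role = ROLES.get(user)
    if role is None:
        return False            # unknown users are always rejected
    if action == "read":
        return True             # any authenticated role may read
    return role == "admin"      # only admins may write or delete

def run_auth_tests():
    """Repetitive checks like these are ideal candidates for automation:
    they run identically on every build with no human involvement."""
    cases = [
        ("alice", "delete", True),
        ("bob", "read", True),
        ("bob", "delete", False),
        ("mallory", "read", False),  # unauthenticated access must fail
    ]
    return all(authorize(u, a) == expected for u, a, expected in cases)

run_auth_tests()  # -> True when the access rules hold on every case
```

Running such a suite on every build is exactly the kind of simple, monotonous task that automation handles better than a human tester.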
Why are many businesses still using manual testing then? Among the most common reasons, developers name incompleteness of the test environment and test data, immaturity of AI-oriented solutions, and frequent application modification. Whatever the case, any issue can be solved by improving DevOps and Agile teams’ expertise and introducing better test data and test environment management.
Each team, with its unique design and development solutions, needs a personal approach to the implementation of QA automation into their ecosystem. Our article offers a few tips on how to balance manual and automated testing, enhance the efficiency of the company’s workflow, and get the best out of QA.
How to Successfully Implement DevOps Techniques into Your Project

Any innovation should start with a solid business case, and DevOps is not an exception. Why should businesses invest in the changes needed to implement DevOps? In this article, we will answer this question and discuss how to effectively implement DevOps methods in a project.
Setting the stage for DevOps implementation
IT technologies have a huge impact on the development of modern businesses. These changes are described in detail in Why Software Is Eating The World by Marc Andreessen. It's not hard to cite a few examples of tech startups that have already made a significant contribution or even changed the entire industry. For this purpose, one can simply look at the radical changes in the retail, transportation, and hospitality industries, which were caused by software innovations from companies such as Amazon, Uber, and Airbnb.
The focus on streamlining workflows and shortening the delivery lifecycle with IT is powerful leverage in business development. But if you improve only business processes, neglecting the convenience of customers, all efforts will be in vain. A manager needs to think about both the optimization of the development process and the quality of the product that the user will receive.
DevOps is an approach aimed at optimizing the product lifecycle and its delivery to the consumer. Not only the application itself but also the processes of its development can be subject to change. According to Geoffrey Moore, communication systems can evolve quickly, but this rule doesn't apply to budgets, because financing systems can't be rebuilt quickly. Money distribution is a delicate process, so changes should be made with extreme caution.
Benefits of implementing DevOps
First and foremost, you need to focus on the value of the business itself. An application is just a tool whose aim is to increase or maintain business value. You need to evaluate the optimization program and identify its benefits in monetary terms. It is recommended to determine the hourly profit that this program is able to generate.
The task of DevOps is not only to optimize an application but also to speed up its development. By eliminating post-release rework and changes, DevOps accelerates all processes, which results in increased profits. Otherwise, constant alterations to existing systems can lead to large losses.
Alongside assessing the profit, it is also necessary to determine the amount of lost benefit. To identify this value, calculate how many times a year there were disruptions in your software - it doesn't matter whether they happened due to unstable program operation or due to modifications. Then calculate the losses in monetary terms and analyze how DevOps can stop the leakage and contribute to increasing profits.
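The calculation suggested above can be sketched in one line: multiply the number of incidents by their average duration and the hourly profit at stake. All the figures below are illustrative assumptions, not benchmarks:

```python
def annual_downtime_loss(incidents_per_year, avg_hours_per_incident, hourly_profit):
    """Estimate the yearly revenue lost to outages (lost benefit).
    Inputs are illustrative assumptions for a given business."""
    return incidents_per_year * avg_hours_per_incident * hourly_profit

# 12 outages a year, 3 hours each, at $2,000 of profit per hour:
annual_downtime_loss(12, 3, 2000)  # -> 72000
```

Comparing this number against the cost of a DevOps program makes the "stop the leakage" argument concrete in monetary terms.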
DevOps helps to assess the quality of work of all employees and teams - managers, programmers, BAs, testers, etc. For example, at Andersen - a company engaged in software development for hundreds of businesses worldwide - our standard is to include DevOps engineers in all teams larger than 5 people. A set of DevOps capabilities provides for the detailed study and planning of all elements of the business, as well as their contribution - both positive and negative.
Benefit evaluation
DevOps can have a significant impact on net profit. But if you put off implementing DevOps, you may be left with no profit at all. After all, one must not forget about competitors, who may be more decisive.
The argument that DevOps is good is simply not enough. You should understand how important it is for your business, not just for its development but for its existence as well.
You can start with a pilot DevOps project to get a rough idea of its benefits. We wouldn't recommend waiting for the right moment to develop — the right moment is now. Growing and changing must be the daily routine of your business; otherwise, you cease to be competitive. To realize how weighty these words are, remember how many projects stopped existing because they failed to heed this advice.
How the Banking System Is Changing Due to Customer Expectations

The financial services industry is undergoing an important development stage. This is not about the creation of some brand new products. Banks are still offering the same things as 20 years ago: account services, debit and credit cards, loans, etc. Compared to the changes taking place in other sectors of the economy, the current situation may seem stagnant. In fact, this is not the case - it’s just that changes in the banking sector are occurring in other directions.
A new era without product innovation
While banks are focusing on classical money management, major changes are taking place in other areas of the economy. For example, the desire to attract more participants to stock trading has led to the emergence of private online investments. The financial market, which used to be closed, has become available to almost everyone.
As for banks, they bet not on new products but on improving the old time-tested ones. Digital services and delivery methods are now in the spotlight. Smartphone owners quickly got used to doing with a couple of clicks what previously required going to a branch and standing in long queues. The emergence of mobile banking applications has significantly reduced the number of bank branch visitors. Why go anywhere if you can perform any operation on the Internet? This approach became even more relevant during the pandemic.
Partnership programs have become one of the key areas of financial service development. All financial players, ranging from traditional banks to insurers, are trying to expand their offerings to attract new customers and increase the loyalty of old ones. This is how really favorable conditions are created - for example, credit cards with cash back or discounts on air tickets. However, coming up with an offer that is attractive to absolutely everyone is not easy. People renting an apartment are not interested in insuring it, and stay-at-home people don’t need travel insurance. Therefore, banks have to try hard.
Peculiarities of affiliate programs
The development of affiliate programs has changed the customer mindset. Nowadays, people choose a bank by not only paying attention to the interest on deposits and loans but also considering additional features. Financial institutions have to adapt to the lifestyle of a new generation of customers and follow them into the virtual space. Today, there are banks that don’t have the usual physical branches at all, and this doesn’t make them less successful, as they focus on other useful services instead.
Affiliate programs that are focused on digital lifestyles and niche services create tremendous potential for attracting new clients. Cooperation of financial institutions with major manufacturers of appliances, clothing, and other goods makes it possible to create exclusive offers with additional benefits. For example, the bank N26, having an exclusive agreement with Adidas, allows its clients to buy products of this brand at more favorable prices.
Another tool for increasing loyalty is support for social and ecological initiatives. Such programs are most widespread in Europe. The clients of these banks know that their money is used not only to develop the organization itself but also to reduce carbon dioxide emissions, plant trees, and fund other environmental campaigns.
All this makes the relationship between banks and customers more complicated. From a one-way street, it has transformed into a data exchange network. Now financial institutions need to offer the most favorable conditions and, moreover, track demand and look for their niche to create an affiliate program that is attractive to the maximum number of people.
The information that customers reveal about themselves provides a rich field for analysis and creation of such packages. This is how such offers as direct cashback, personalized discounts at online stores, or subscriptions to streaming services emerge. The client, in turn, doesn’t simply purchase a one-time service but starts thinking from a business standpoint, expecting additional profit from investments.
However, gaining access to a large amount of personal information comes with great responsibility for its security. It’s not enough for banks to simply develop applications that provide digital services to customers. What is needed is a powerful and flexible IT architecture that will both offer customers convenient services and ensure secure operation.
The legacy systems of some banks are ineffective for solving such problems. The development of new architecture is an indispensable prerequisite for full digitalization. Forming a team to implement such a project requires competence in various fields, and very few companies have all the necessary specialists on their staff. For over 13 years, we at Andersen have been helping banks and companies from other industries to form IT teams and develop software of various levels. A rapid movement towards digitalization is hardly possible without the involvement of outsourcing. Our company provides IT experts specializing in a wide variety of fields for companies of all sizes, from huge corporations like Siemens or MediaMarkt to small startups.
Bitcoin is Already in the Top 10 Most Expensive Exchange-Traded Assets in the World

A year ago, I wrote an article in which, together with experts in the finance and crypto industry, I tried to figure out the chances of bitcoin rising again and “flying to the moon.” It seemed that the odds were pretty good — both the technical and fundamental analysis and additional factors indicated that.
Then the corona crisis hit, and it seemed as if that should have dramatically stimulated bitcoin's growth. People were staying at home, and banks were hastily trying to go digital — not all of them successfully. It was time to install crypto wallets and start ordering products online with bitcoin payment, wasn't it?
It seems to be so, but here the service infrastructure as a whole was not ready. Unfortunately, the number of retailers accepting cryptocurrencies is still relatively small. The bitcoin breakout happened neither in the spring nor in summer. Instead, bitcoin took off in the fall, and how!
At the end of November last year, bitcoin began increasing rapidly, and already in mid-December, it surpassed a historical record amount of $19,500 (according to Coinmarketcap). The first cryptocurrency didn’t stop there — on January 8, 2021, its price reached $41,300. This mark was temporarily the highest amount for a month up until earlier this week when bitcoin jumped to $46,000.
The obvious stimulus for bitcoin’s growth on February 8-9 was the Tesla report’s data to the US regulator SEC. The report indicated that the company invested $1.5 billion in bitcoin. In a commentary, Tesla said that corporate rules regarding investing were updated in January: the company wants to make its asset portfolio more flexible.
To sum up, it's a kind of double hype: a company that, according to analysts, is overvalued 10-15 times provoked a new wave of growth in an asset whose value no one can really estimate. However, the capitalization of bitcoin is not difficult to determine, and at the moment, there are only six exchange-traded assets in the world with a higher capitalization: Tencent, Google (Alphabet), Amazon, Microsoft, Saudi Aramco, and Apple.
Nevertheless, whether the latest spurt of bitcoin was due to the hype or not, there are fewer doubts about cryptocurrencies’ global success every year. Yes, this take-off will be more than likely followed by a logical correction of 20-30%. Maybe it will take several more years to reach the $100,000 mark, or perhaps we will see such a price in the near future. Currently, for myself, I draw at least two conclusions:
1. Bitcoin will live.
2. If you invest in bitcoin right now, there is a high risk of ending up in the situation of those investors who bought bitcoin in December 2017 :).
Using RTLS to Fight COVID-19
The COVID-19 pandemic has had a significant impact on society, changing our habits, ways of interaction, and work processes. Today, companies more than ever strive to adhere to safety measures while maintaining optimal performance. And a real-time locating system (RTLS) helps them with that. How?
RTLS makes it possible to locate people or objects and assist with navigation. Companies are successfully using this solution to optimize workflows and keep their employees safe amid the coronavirus pandemic. Below are four ideas for utilizing such technology.
No. 1 Social distancing
RTLS allows you to locate people in real-time to monitor compliance with social distancing. For example, employees of an enterprise can wear small tags on their clothes or wrists. These labels track the distance between colleagues and send a vibration alert if they get too close to each other. This type of solution is suitable for offices, factories, hospitals, and other facilities.
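The alerting logic such tags might run can be reduced to a pairwise distance check. The following is only a toy sketch: the tag IDs, coordinates, and safe-distance threshold are invented for illustration.

```python
import math

# Hypothetical 2D positions of RTLS tags, in meters.
positions = {
    "tag-A": (0.0, 0.0),
    "tag-B": (1.2, 0.5),
    "tag-C": (6.0, 4.0),
}

SAFE_DISTANCE = 2.0  # meters

def too_close_pairs(positions, threshold=SAFE_DISTANCE):
    """Return every pair of tags closer to each other than the threshold."""
    tags = sorted(positions)
    alerts = []
    for i, a in enumerate(tags):
        for b in tags[i + 1:]:
            ax, ay = positions[a]
            bx, by = positions[b]
            if math.hypot(ax - bx, ay - by) < threshold:
                alerts.append((a, b))  # a real tag would vibrate here
    return alerts

print(too_close_pairs(positions))
```

Here tags A and B are about 1.3 m apart and would both receive an alert, while tag C is far enough away from either of them.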
No. 2 Contact tracing
Contact tracing is another crucial element in the fight against the COVID-19 threat. The described technology helps to find out where infected people were and whom they interacted with. According to Thomas Hasselman, Chief Marketing Officer at Quuppa, RTLS can retrospectively check people's locations. If a person tests positive for COVID-19, all of their contacts will receive notifications based on the collected data. Mr. Hasselman believes that after the current pandemic, contact tracing based on RTLS will become the new standard in the workplace, especially in hospitals.
No. 3 Quarantine monitoring
Today, many governments are moving towards a TTI strategy: test, trace, and isolate. But how to make sure that people follow the conditions of quarantine? Here again, RTLS comes to the rescue. For example, healthcare workers can use this technology to create a virtual geofence for each quarantined person. In this case, the patient must wear an RTLS tag provided by the health authority. If they try to leave the designated area or remove the mark, the system will notify the officials.
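At its core, the virtual-geofence check is a point-in-radius test. The sketch below assumes a simple circular fence in local metric coordinates; a real deployment would first convert RTLS or GPS fixes into such coordinates, and the positions here are made up.

```python
import math

def inside_geofence(tag_pos, center, radius_m):
    """True if the tag is within radius_m meters of the geofence center.

    Positions are (x, y) in a local metric coordinate system.
    """
    dx = tag_pos[0] - center[0]
    dy = tag_pos[1] - center[1]
    return math.hypot(dx, dy) <= radius_m

home = (0.0, 0.0)  # hypothetical quarantine location

# Still within the permitted zone:
print(inside_geofence((10.0, 20.0), home, radius_m=50.0))
# Outside the zone — the system would notify officials:
print(inside_geofence((80.0, 10.0), home, radius_m=50.0))
```

A production system would also treat a missing or removed tag signal as a violation, as the text above notes.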
No. 4 Tracking of assets and people in hospitals
Using RTLS, one can locate any wheelchair, medical device, hospital bed, patient, doctor, and other object or person in real time. Moreover, the system displays their statuses. For example, whether a patient has recovered, whether a bed is free, whether the equipment is ready for use or needs repair, etc. That will speed up patient care in times of the pandemic and in the context of a large flow of people during routine examinations and calls for emergency medical care.
For just over a year, society has been constrained by the pandemic. Scientists predict that in a year or two, the coronavirus will join the cohort of seasonal viruses that cause respiratory diseases in people. Now is the time to restore human relationships, work processes, and financial resources while staying safe. People should use all available tools, methods, and technologies to do so. RTLS is just one of the solutions that will help us achieve these goals.
Digitalization Comes into Clinical Practice in Intensive Care Units

Recently, intensive care units (ICUs) have been paying special attention to systematizing the monitoring of patients' condition. Most ICUs are already furnished with modern medical equipment that automatically monitors the functions of the patient's body and administers medications.
However, even with the complete automation of reading patient indicators, there is one problem: all these data are scattered, and they have to be manually processed and transferred to observation charts, medical records, and logbooks.
The staff of the ICU is under constant mental and physical stress, as they bear a huge responsibility. In addition, during resuscitation measures, there is simply no time left for paperwork. This can lead to the loss of information, which, in turn, results in incorrect decisions, financial losses, and medical errors.
Filling out medical records accurately is time-consuming, and ICU staff sometimes have to spend hours on this activity that should instead be dedicated to caring for patients. In other cases, after a three-hour operation, a doctor spends the same amount of time filling out documents.
There have already been attempts to automate data collection in ICUs in order to release medical personnel from the routine of filling out cards and books. One of these solutions was developed by specialists from the Center for Cardiovascular Surgery in Astrakhan, the Russian Federation.
The problem’s solution
Having thoroughly studied the practices of clinics around the world, Astrakhan specialists turned to Philips. Together with Philips, they integrated the documentation and equipment of the intensive care wards of their clinic into a single IntelliSpace Critical Care & Anesthesia (ICCA) system.
Now all information about the unit's patients is collected in an autonomous database and can be viewed by a clinic specialist at any time. The system has certain restrictions: information is available to all resuscitation personnel, from nurses to heads of department, but each specialist is assigned a corresponding level of access. Thus, the huge amount of work that used to fall on the doctors' shoulders is now carried out by the system automatically.
How the ICCA system works
The system connects to patient support equipment. All information coming from monitors and other devices is automatically structured. All medication dosages and physician prescriptions are saved by ICCA. Filling out medical records is no longer the responsibility of nurses; they can devote this time to patients. Senior healthcare personnel can now make serious decisions without fear of patient data being incorrect.
An additional advantage is that, thanks to the automatic data ordering, a specialist can now find specific information much more quickly. What’s important is that the system accompanies the patient from the surgery room to the ICU.
In the opinion of doctors, a huge advantage of ICCA is that it helps them track all the procedures the patient went through. A specialist can see which medications were prescribed and who made adjustments to the treatment regimen. Thus, a competent system for supervising each patient was developed.
Today, the specialists of the center can no longer imagine their work in the previous mode, without using the ICCA system. Doctors are pleased to share their positive experiences with colleagues from other institutions. Medical personnel are confident that no ICU can do without an automatic collection of patient data, which means that the implementation of the system should be encouraged and developed by all means.
At Andersen, we are also experiencing a growing need for automation from healthcare facilities. We are approached by both public organizations and private companies with tasks of varying complexity. We can either develop a comprehensive system from scratch or provide required specialists to help the customer's team. This is the main (but far from the only) advantage of outsourcing.
Using Neural Networks to Discover Antibiotics

Antibiotic resistance is one of the greatest challenges of modern medicine. More than a hundred thousand people die every year because doctors cannot treat bacterial infections. However, there is an unexpected ally in this fight for lives, which can help to solve the problem of bacterial resistance to existing drugs. This ally is neural networks. Scientists from the Massachusetts Institute of Technology demonstrated that well-trained neural networks can successfully identify new antibiotics from millions of candidate molecules.
Why many antibiotics are becoming ineffective
Simply put, the mechanism of bacterial adaptation to antibiotics can be described as follows: random mutations constantly occur in bacterial DNA, and due to their huge number, there is always a probability that some of these mutations will help particular bacteria survive in new conditions. The rest of the population might die, but the surviving ones will quickly multiply and take their place. Bacteria are unlikely to survive boiling or intense irradiation, but many of them no longer respond to antibiotics. Developing resistance requires a certain period of time, and with each year, less and less time is needed. For example, by the early 1970s, most of the gonococcus bacteria had developed high-level resistance to antibiotics of the penicillin group.
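The selection mechanism described above can be illustrated with a toy simulation — mutate, kill susceptible cells, let survivors regrow. All parameters here are invented for illustration and have no biological accuracy:

```python
import random

random.seed(1)

# Toy model of resistance selection. Each bacterium is True (resistant)
# or False (susceptible). All parameters are illustrative only.
POP = 10_000
MUTATION_RATE = 0.001  # chance a daughter cell gains resistance

def regrow(pop, size=POP):
    """Survivors multiply back to the original population size;
    each daughter cell may randomly mutate to resistance."""
    if not pop:
        pop = [False]  # fresh susceptible colonization from outside
    return [random.choice(pop) or (random.random() < MUTATION_RATE)
            for _ in range(size)]

def apply_antibiotic(pop):
    """Susceptible cells die; resistant ones survive."""
    return [cell for cell in pop if cell]

population = [False] * POP  # initially, no one is resistant
for generation in range(20):
    population = regrow(population)
    population = apply_antibiotic(population)

print(f"Resistant cells after repeated exposure: {len(population) / POP:.0%}")
```

Even with a tiny mutation rate, the few resistant cells that appear are the only ones that survive each dose, so within a couple of regrowth cycles the entire population is resistant — exactly the dynamic the paragraph describes.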
There is a simple rule – the less frequently you take antibiotics, the less likely the bacterial communities are to adapt to them. This applies both to individuals and to entire continents. Unfortunately, when most of the bacteria on the planet develop resistance to a substance, the bacteria that made you ill are likely to have this resistance too, even if you have never taken any antibiotics.
Solving the problem of antibiotic resistance requires many simultaneous efforts: reduction of their usage in agriculture, control over their sales, and monitoring resistant nosocomial (hospital-acquired) strains. But even all these efforts combined will not be very effective without discovering new substances, and this task is becoming more difficult every day.
It is well known that the first antibiotic, penicillin, was discovered by accident — a mold that produces penicillin got into a petri dish containing a bacterial culture. Nevertheless, Alexander Fleming, the author of this discovery, won the Nobel prize, and hundreds of thousands of people were saved from a number of infections (at least for a while).
But today, discovering new medicines is a demanding and costly task: all the low-hanging fruit has already been picked, and scientists have to expend more and more effort only to rediscover known substances. With new computer-based methods of analysis, researchers are constantly trying to optimize the process of discovering new antibiotics and ease the endless "manual" search through different substances.
How neural networks solve this problem
Artificial neural networks have become one of the most popular methods of predicting and searching for interconnections in biological systems. Such a network simulates, to a certain extent, biological neural networks in a brain and functions as a collection of connected computational units that are able to receive input data, transmit signals to each other, and generate a response. The more complex the architecture of this collection is, the more complex the neural network is and the more complex tasks it can learn to solve.
What's interesting is that, when it comes to complex studies of genomes or other biological data, the researcher often needs not only to obtain predictions from the neural network but also to understand the stages of its learning process post factum. For instance, a neural network can find a pattern in the interaction of particular proteins and particular segments of DNA and learn to predict which new proteins will have similar properties. However, scientists will still need to figure out what exactly this discovered pattern is, as the neural network does not learn in the same way as people. It has a completely different logic and tracks the "research" in an alternative way.
Nowadays, the effective usage of neural networks in biology and medicine is just in its infancy. In a new article published in Cell, a group of MIT researchers led by James Collins said that they successfully screened millions of candidates for antibiotics using Deep Learning methods (a set of Machine Learning methods used by neural networks).
During the learning process, the neural network was trained to spot potential antibiotics among 2,335 molecules whose effect on the model bacterium — Escherichia coli — was well known. The chemical structure of each molecule was encoded as a set of numbers describing the bonds between its atoms. The task of the neural network was to detect the motifs in these structures that were responsible for antimicrobial activity.
Once the system learned to predict the properties of a substance based on the shape and composition of its molecule, it was granted access to several electronic chemical libraries of a much larger volume. These libraries contained more than a hundred million molecules in total, and the overwhelming majority of them had never been studied for their effect on bacterial cells.
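In spirit, the pipeline is: train a classifier on numerically encoded molecules with known activity, then score a much larger unlabeled library. The sketch below uses invented bit-vector "fingerprints" and a minimal logistic regression as stand-ins for the real chemical encodings and deep network used in the study:

```python
import math
import random

random.seed(0)

# Toy "fingerprints": each molecule is a bit vector, loosely analogous to
# encoding which substructures it contains. In this synthetic data, bit 0
# is secretly correlated with antibacterial activity.
def make_molecule(active):
    bits = [random.randint(0, 1) for _ in range(8)]
    bits[0] = 1 if active else 0
    return bits

train = [(make_molecule(a), a) for a in [1, 0] * 50]

# Minimal logistic regression trained by stochastic gradient descent
# (a stand-in for the deep network in the actual study).
w, b, lr = [0.0] * 8, 0.0, 0.5
for _ in range(200):
    for x, y in train:
        p = 1 / (1 + math.exp(-(sum(wi * xi for wi, xi in zip(w, x)) + b)))
        g = p - y  # gradient of log-loss w.r.t. the logit
        w = [wi - lr * g * xi for wi, xi in zip(w, x)]
        b -= lr * g

def score(x):
    return 1 / (1 + math.exp(-(sum(wi * xi for wi, xi in zip(w, x)) + b)))

# "Screen" an unlabeled library and keep the most promising candidates.
library = [make_molecule(random.randint(0, 1)) for _ in range(20)]
candidates = [m for m in library if score(m) > 0.9]
print(f"{len(candidates)} of {len(library)} molecules flagged as candidates")
```

The real study's advantage is that a graph neural network learns its own structural features instead of relying on hand-made fingerprints, but the train-then-screen shape of the workflow is the same.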
And what is the result?
According to the article published in Cell, the patterns the neural network detects in the structure of antibacterial substances are enough to spot potential antimicrobial agents among diverse compounds. In particular, of the compounds that the neural network and the scientists predicted to have antibacterial action, more than half turned out to actually be working antibiotics.
Scientists were able to find at least one potent substance whose antibacterial properties had not previously been known. This was a compound that the scientists called "halicin" — an understudied kinase inhibitor not previously used as an antibiotic. A number of laboratory experiments showed that this substance really is able to inhibit the growth of a wide range of bacteria, including strains resistant to most modern antibiotics. This happens through a mechanism not previously exploited in antibiotics: halicin inhibits proton pump activity by reducing the sensitivity of bacterial membranes to changes in pH. And since the proton pump is a vital component of the bacterial cell, its inhibition is incompatible with the cell's survival.
In addition to halicin, the neural network predicted at least eight new compounds that could have antibacterial properties. In these compounds, it also found mechanisms that had never been used as antibacterial agents before; and at least two of them showed successful results during laboratory research. The scientists note that although these results look impressive, there are still many difficulties to be encountered in further studies of this type.
Why it is not so simple
This is not the first time scientists have tried to use neural networks to pinpoint potential medicines. In fact, they have been trying to use neural networks to select new antibiotics for more than twenty years. In recent years, training samples have become more and more complex, and the data obtained have been confirmed experimentally. For instance, a recent study published in Frontiers in Pharmacology stated that Russian scientists had managed to identify at least two new compounds with strong antimicrobial activity against several bacterial species. However, all of these studies face one very significant problem: a lack of data.
The neural networks need decent — large and detailed — training samples. However, the set of experimental data and knowledge of all the subtleties of the structure and activity mechanisms of various substances is still limited. In addition, you cannot say which aspects will be important and which won’t. You will not be able to test everything because it is quite expensive, and the neural network itself is often too optimistic. For example, when it screens a million candidates, it offers hundreds of options that are good from its point of view.
Six months ago, Canadian scientists taught a neural network to predict whether the antibiotic prescribed to a patient with a urinary tract infection would actually help. They trained it on nine years of patients' clinical data and checked whether it could correctly predict outcomes in the tenth year. It turned out that physicians had prescribed an ineffective antibiotic in 8.5% of cases. Moreover, had they chosen medicines at random, they would have been mistaken almost as often — in 10% of cases. The neural network recommended an ineffective antibiotic in only 5% of cases, which is, of course, an improvement, though not a dramatic one. However, it should be noted that this study was still quite empirical. The nature of clinical-history data, patients' responses to particular antibiotics, and the specifics of particular populations impose great restrictions, and you cannot blindly follow the neural network's advice in medical practice without additional checks.
To narrow down the number of new antibiotic candidates, the authors of the Cell study focused on the similarity of candidate molecules to known antibiotics (even though their general mechanism of action, as mentioned above, could be quite different) and also introduced restrictions — for example, on a substance's potential toxicity to humans. But the more such clarifications and rules are introduced, the more the process resembles a rollback to manual data processing — and scientists want the computer to do everything on its own.
Nevertheless, the researchers are optimistic. They believe that improving methods and training samples (e.g. more clearly dividing them into groups based on their influence on bacteria and on how they interact with different types of substances), as well as developing new neural network algorithms will bring new projects increasingly fruitful results.
Five Basic Cybersecurity Solutions that will Suit any Company

Verizon's CEO Lowell McAdam once said that any company would sooner or later become a victim of hackers, no matter how hard it tried to ensure cybersecurity. McAdam had his reasons for such an opinion: in 2017, three months after Verizon's acquisition of Yahoo, it emerged that a 2013 hack had put all three billion Yahoo accounts at risk.
However, the inevitability of an attack doesn’t mean that no protective measures need to be taken. After all, one hack in the entire history of a company is much better than ten hacks or a hundred hacks (though hardly any company, even a large one, can survive this). Although hackers’ activity is growing, and incidents are more frequent, many companies continue to ignore the danger.
According to a survey conducted by the UK government, 68% of business leaders have not received any training on how to mitigate information security vulnerabilities. Perhaps the reason lies in managers' reluctance to undertake significant expenses without being sure of the result. What can an average mid-sized company hope for if even tech giants can't fully secure themselves from attacks?
Indeed, many cybersecurity systems are expensive. However, basic security tools built into the OS or free additional software are enough to protect you against the overwhelming majority of malicious programs. In addition, there are many budget solutions for corporate security. We'll consider five of the most popular ones that will suit any company.
1. Antivirus software
Malware detection software can be expensive, but companies like Kaspersky or McAfee offer budget solutions for small businesses. An antivirus subscription for 20 devices will cost about $100-150 per month. For larger companies, there are more complex and expensive solutions, the price for which varies between $500-1000 per month. Given the fact that large companies experience damage from hacker attacks to the extent of around tens of thousands of dollars every month, such costs look quite reasonable.
2. Training and instructing employees
Data leakage often occurs even without the intervention of cybercriminals - solely due to the negligence of employees. This can be a laptop or phone left unattended, an account logged into on someone else's computer, etc. According to a Willis Towers Watson study, about 70% of leaks are employees’ fault. Such portals as StaySafeOnline.org, Social-Engineer.com, etc. provide articles, online training, and useful information for employees and managers on how to significantly reduce the vulnerability of company data using simple measures. If you want full-fledged training, you can order a realistic simulation of a DDoS attack and organize interesting and useful gamification for your employees.
3. Network performance and web security
In modern business, one can hardly find a company that doesn’t have a website. Meanwhile, every website needs protection from attacks that can lead to its disruption or data losses. For this purpose, you can use budget solutions like those offered by Cloudflare or Incapsula. They provide both basic free versions and advanced paid ones, including those with additional functions such as CAPTCHA checks, bot blocking, etc.
4. Personal data protection services
Companies have to defend themselves not only from attacks on their intellectual property but also from common fraud. If employees receive an email from a manager’s corporate address with a request to transfer a certain amount of money to a specified account (for example, to help a colleague in trouble), many are ready to do it without asking questions.
According to the FBI, the damage from such fraud has increased significantly over the past few years. Fraudsters are becoming more skillful and not just creating mailboxes that look like corporate accounts but hacking the real ones. You can protect your company from such a threat for free, by introducing strict rules for correspondence between employees, or for a price by purchasing a protection package from Experian or Lifelock costing around $150 per month. Such services not only provide additional protection against hacking of corporate mail but also help to eliminate its consequences in the shortest possible time.
5. Mobile applications
Mobile devices are actively used in business processes, so having a solution to protect the data passing through them is vital. There are special app managers that generate and manage complex passwords. End-to-end encryption applications such as Signal help to protect calls and messages, and apps like Keeply provide the opportunity to store data in a secure container.
At Andersen, we pay great attention to cybersecurity and provide consultation to our clients. During the period of lockdown and active transfer to remote work, we helped some of our current clients adapt to new conditions and organize secure processes. Since then, interest in corporate security services has only been growing, and in a short time, cybersecurity has become an important area for our company.
Three Reasons To Win The Testing Trophy
In 2018, Kent C. Dodds proposed a new testing model as an alternative to the well-known pyramid. He gave prominence to integration testing, which forms the biggest part of the shape. Dodds argued that this is the layer where the key benefits are concentrated: reliable results at relatively low expense and within a short time. This model was called the Testing Trophy.
Even taking into consideration all the pros of such a concept, how does one put it into practice? In what way can one increase the number of integration tests? Let's shed some light on the matter.
Jeff Nyman proposed to enhance test simplicity in integration tests and lay stress on production faithfulness earlier in the testing process. He brought into focus the importance of a design that clears up the intent and helps to make decisions in situations where there are multiple choices.
Dodds recommended two ways to solve this issue:
1. When testing, one shouldn't pay too much attention to every single piece of code at the unit level. The more unit tests you write just to cover the code, the less these tests tell you about the app's compliance with the requirements. In other words, once code coverage gets above roughly 70%, the benefits of additional tests start to decrease. For example, by paying heed to rare cases that are unlikely to have much impact, one risks postponing deployment dates.
2. One should ensure the correctness of units at the integration stage. If an integration test exercises the same elements that a unit test does, then successfully passed integration tests indicate that the units the modules consist of are also correct.
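As a minimal illustration of this idea in Python (the functions under test are invented for the example), a single integration test can exercise several units at once:

```python
# Two small "units" and the function that integrates them (all hypothetical).
def parse_price(text: str) -> float:
    """Unit 1: turn '19.99 USD' into 19.99."""
    return float(text.split()[0])

def apply_discount(price: float, percent: float) -> float:
    """Unit 2: apply a percentage discount."""
    return round(price * (1 - percent / 100), 2)

def checkout_total(price_text: str, discount_percent: float) -> float:
    """The integration point that wires both units together."""
    return apply_discount(parse_price(price_text), discount_percent)

# One integration test covers the path through both units: if it passes,
# parse_price and apply_discount are also behaving correctly for these
# inputs, without a separate unit test for each of them.
def test_checkout_total():
    assert checkout_total("19.99 USD", 10) == 17.99

test_checkout_total()
print("integration test passed")
```

Unit tests for the rare edge cases of each helper can still be added where they pay off; the point is simply that the integration layer already vouches for the common paths.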
Are you curious to find out more? Learn about the features of the above testing approach in this article on our website.