vbhattad
Virendra Bhattad
8 posts
Some thoughts on the future of technology
vbhattad · 5 years ago
Demand planning and forecasting in times of COVID-19 for the food and beverage industry
The food and beverage industry will likely undergo disruption and change in the near future. Most stakeholders will now want to understand supply chain transparency and will be asking questions such as:
How was my food prepared, and how can I be sure it was prepared hygienically?
What quality controls are in place in the production facility?
How can my company provide data for food transparency while also increasing production?
Can my industry meet new regulatory requirements in light of COVID-19?
Most of this data is currently siloed and proprietary. Some of the solutions include:
Better forecasting and demand planning in light of uncertain and changing scenarios (see the forecasting sketch after this list)
Production planning and scheduling as required based on forecasting
Shelf-life and seasonality management for uncertain times
Integrated label compliance for consumers who will now demand it
Advanced product lifecycle management
End-to-end traceability
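To make the forecasting item above concrete, here is a minimal sketch of demand forecasting under changing conditions using simple exponential smoothing. The product names and weekly numbers are hypothetical illustrations, not real data, and a production planner would use a richer model and real history.

```python
# A minimal sketch of demand forecasting under uncertainty, using simple
# exponential smoothing on hypothetical weekly sales data.

def exponential_smoothing(history, alpha=0.5):
    """Return a one-step-ahead forecast from a list of past demand values."""
    forecast = history[0]
    for actual in history[1:]:
        # Blend the latest observation with the previous forecast;
        # a higher alpha reacts faster to sudden demand shifts (e.g. COVID-19).
        forecast = alpha * actual + (1 - alpha) * forecast
    return forecast

weekly_demand = {
    "bottled_water_cases": [120, 135, 180, 260, 240],  # hypothetical demand spike
    "fresh_juice_cases":   [90, 85, 60, 40, 45],       # hypothetical demand drop
}

for product, history in weekly_demand.items():
    print(product, "next-week forecast:", round(exponential_smoothing(history), 1))
```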
Do you think the technology solutions above will help in achieving this? How can they be achieved for small and medium-sized businesses that are severely impacted by the new health crisis?
vbhattad · 7 years ago
Reasons to Consider Implementing an API Gateway
Given the multiple channels that today's enterprises and applications need to support, I will discuss the API Gateway pattern, which can help enterprises become API-centric. For example, you may need to orchestrate aggregated data from various backend services. Take the example of a company that needs to develop multiple versions of its dashboard pages based on the following requirements:
Responsive UI for desktop and mobile browsers
Native mobile clients which can interact via RESTful APIs
For an online eCommerce store using a microservices architecture, data may be retrieved from multiple services across various applications. Each decomposed service can run independently with its own data set, with its backend applications implemented as microservices:
Product Information Service
Inventory Service
Price List Service for each country
Ordering Service
Order Confirmation Service
Shipment Details Service
Ultimately, the front-end channels need to fetch information from all of these back-end services. Various front-end channels or applications need to call multiple open APIs to combine and display their data.
Problem
How do the clients of a microservices-based application access these independent services? In the above case, every channel will need logic to call multiple backend APIs. Since data is transferred over HTTP, this may result in network congestion and redundancy. Also, security requirements can differ for each client application.
Solution
We will use an API gateway, which can provide the following benefits in this architecture:
API Lifecycle Management
Payload Modeling and Transformation - allows developers to modify the JSON schema during the request and response cycles
API logging, caching, throttling, bursting and monitoring capabilities (a minimal throttling sketch follows this list)
Fewer API requests - with fewer round trips, less overhead is incurred, improving the user experience
An API facade shields clients from the complete implementation details
Simplification by consolidating the logic to call multiple APIs in one place
Increased security by implementing client security at a single point of entry for APIs
Other advantages include protocol conversion, data transformation, load balancing, intelligent routing and service-level monitoring capabilities.
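As a concrete illustration of the throttling and bursting capability mentioned above, here is a minimal token-bucket sketch. The rates, burst size and client keys are hypothetical; a real gateway product would provide this as configuration rather than code.

```python
# A minimal sketch of API throttling at the gateway using a token bucket.

import time

class TokenBucket:
    def __init__(self, rate_per_sec, burst):
        self.rate = rate_per_sec      # tokens refilled per second
        self.capacity = burst         # maximum burst size
        self.tokens = burst
        self.updated = time.monotonic()

    def allow(self):
        now = time.monotonic()
        # Refill tokens based on elapsed time, capped at the burst capacity.
        self.tokens = min(self.capacity, self.tokens + (now - self.updated) * self.rate)
        self.updated = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False   # caller would return HTTP 429 Too Many Requests

buckets = {}  # one bucket per client API key (hypothetical keys)

def is_request_allowed(api_key, rate_per_sec=5, burst=10):
    bucket = buckets.setdefault(api_key, TokenBucket(rate_per_sec, burst))
    return bucket.allow()

print(is_request_allowed("client-abc"))  # True until the burst is exhausted
```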
Other Considerations
Please consider the following before using the API gateway pattern:
An additional layer for managing APIs - the API gateway itself must be developed, deployed and monitored
Depending on network configuration, the extra layer can increase response time. For most applications, however, this is insignificant.
Here is a simplified view of an API gateway implementation.
[Diagram: simplified view of an API gateway implementation]
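To complement the diagram, here is a minimal sketch of the gateway's aggregation role: one client-facing call fans out to several backend microservices and returns a single combined payload. The service URLs and the /dashboard composition are hypothetical placeholders, not a specific product's API.

```python
# A minimal sketch of API gateway aggregation over hypothetical backend services.

import json
from concurrent.futures import ThreadPoolExecutor
from urllib.request import urlopen

BACKENDS = {
    "product":   "http://product-service/api/products/{id}",
    "inventory": "http://inventory-service/api/stock/{id}",
    "price":     "http://price-service/api/prices/{id}?country=US",
}

def fetch(url):
    # Each backend returns JSON; errors are reported per service rather than
    # failing the whole dashboard response.
    try:
        with urlopen(url, timeout=2) as resp:
            return json.load(resp)
    except Exception as exc:
        return {"error": str(exc)}

def get_dashboard(product_id):
    """One client-facing request instead of three separate round trips."""
    urls = {name: url.format(id=product_id) for name, url in BACKENDS.items()}
    with ThreadPoolExecutor() as pool:
        results = dict(zip(urls, pool.map(fetch, urls.values())))
    return results

print(get_dashboard(42))
```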
vbhattad · 8 years ago
Identity and security architecture for hybrid organisations
In the latest security breach, Deloitte was targeted in an attack that compromised its emails along with the plans of some of its clients. This incident is a strong affirmation that identity is now at the center of information security, and even skilled security practitioners and architects are struggling with such incidents. We now have highly hybrid organisations, with on-premise and cloud infrastructure working together, and this is one of the main reasons for such breaches: the rush to embrace cloud computing has not always been accompanied by detailed architectural consideration for complex organisations. Identity and federated security are now the glue that binds everything together for hybrid organisations.
So organisations should be rethinking their approach to identity security. The password is long gone as a means to secure identity, and even two-factor authentication is not enough. Here are some suggestions to raise the bar:
Provide adaptive access control methods.
Add extra layers of protection via IP listing and IP monitoring.
Review user behaviour analytics and block unusual attempts (a minimal sketch follows this list).
Utilise machine learning and artificial intelligence to identify threats before they occur.
Encourage encryption of emails where information is very sensitive.
Add archiving and similar features based on business rules, such as erasing anything older than a certain number of years.
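Here is a minimal sketch combining two of the suggestions above: an IP allow-list check plus a simple velocity rule that flags unusual login attempts. The trusted networks, thresholds and in-memory attempt store are hypothetical; a real deployment would use adaptive access control and behaviour analytics tooling.

```python
# A minimal sketch of flagging suspicious logins via IP checks and attempt velocity.

import ipaddress
from collections import defaultdict
from datetime import datetime, timedelta

TRUSTED_NETWORKS = [ipaddress.ip_network("10.0.0.0/8"),
                    ipaddress.ip_network("203.0.113.0/24")]  # example ranges
MAX_ATTEMPTS = 5
WINDOW = timedelta(minutes=10)

recent_attempts = defaultdict(list)  # user -> timestamps of recent attempts

def is_login_suspicious(user, source_ip, now=None):
    now = now or datetime.utcnow()
    ip = ipaddress.ip_address(source_ip)

    # 1) Flag logins from outside the trusted ranges for extra verification.
    off_network = not any(ip in net for net in TRUSTED_NETWORKS)

    # 2) Flag bursts of attempts inside the time window (possible credential stuffing).
    attempts = [t for t in recent_attempts[user] if now - t < WINDOW]
    attempts.append(now)
    recent_attempts[user] = attempts
    too_many = len(attempts) > MAX_ATTEMPTS

    return off_network or too_many

print(is_login_suspicious("alice", "198.51.100.7"))  # True: outside trusted ranges
```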
Going even further, leveraging blockchain technology can provide a fundamentally different approach to cybersecurity, one that extends beyond endpoints to include user identity security.
These paradigm shifts can also provide capabilities such as transparency and audit trails that will enable us to make the most of a hybrid strategy while eliminating such potential security threats. The entire gamut of organisations now has an excellent chance to learn from Deloitte's and Equifax's security breaches, and it allows them and others like me to share views, possible protective measures and new processes. This will definitely be a good lesson learned, but not without a cost.
vbhattad · 8 years ago
Hybrid integrations for next-generation SaaS products
Integration is key to every SaaS product, and not all integrations are created equal. For many SaaS-based startups, the value in their product comes from leveraging the ecosystem around them. The next generation of integrations is hybrid, seamlessly leveraging cloud, on-premise, human and machine data. There are several integration approaches that I can think of:
Native Integration — Integrations developed using a simple request/reply implementation such as JMS. Here a requestor application sends a request, a replier application receives the request and returns a reply, and finally the requestor receives the reply. Invalid messages are routed to a special channel. Native integrations are the most valuable, as the quality is typically higher and the SaaS company is committed to maintaining them. (A minimal request/reply sketch follows this list.)
Overlay Integrations — Integrations that modify and enhance functionality by overriding the user interface of a third-party app. These are UI overlay extensions; a common example would be Chrome or Firefox extensions that block advertisements.
Middleware Integrations — Integrations provided and maintained by a third-party integration platform (e.g. Tibco or Mulesoft) to connect two disparate apps. Middleware provides a host of capabilities, including smart decisions, business process orchestration, API access, operational insights, interoperability and the ability to rapidly prototype services and composite business applications. Middleware integrations can be more time consuming and expensive depending on the APIs and adapters of the products being connected.
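Here is a minimal, in-process sketch of the request/reply pattern described under Native Integration, including an invalid-message channel. Real native integrations would use a broker such as JMS; the in-memory queues and the order payload here are illustrative stand-ins.

```python
# A minimal sketch of request/reply messaging with an invalid-message channel.

import queue

request_channel = queue.Queue()
reply_channel = queue.Queue()
invalid_channel = queue.Queue()

def replier():
    """Consume one request, validate it, and either reply or route it aside."""
    message = request_channel.get()
    if "order_id" not in message:
        invalid_channel.put(message)          # invalid messages go to a special channel
        return
    reply_channel.put({"order_id": message["order_id"], "status": "CONFIRMED"})

# The requestor sends a request and later receives the reply.
request_channel.put({"order_id": 1001, "sku": "ABC-1"})
replier()
print(reply_channel.get())                    # {'order_id': 1001, 'status': 'CONFIRMED'}

# An invalid message is routed to the invalid-message channel instead.
request_channel.put({"sku": "MISSING-ID"})
replier()
print(invalid_channel.get())
```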
When devising a product roadmap for integration, it's important to understand that there are various options and each serves its own purpose.
Are there any other integration approaches that come to your mind for SaaS products?
vbhattad · 12 years ago
Becoming social and smart so you exist
In this blog, I am talking about machine-to-machine (M2M) communication, wireless innovation, and a social network of myriad things for the industrial and manufacturing sector. I think this sector is ripe for innovation in the coming decade or so.
The last decade has continued to see the evolution of social networks and cloud computing. First we saw the advent of ASPs, which have now transformed into SaaS and are leveraging the benefits of cloud computing. We have moved into a world where people from across the globe are connected seamlessly. After computers and people, it is now obvious that machine-to-machine communication is making social more relevant. Smart appliances already send all sorts of data, such as service repairs, uptime and alerts about when they need to be serviced. I remember the excited faces of my kids when they started watching a smart TV and the myriad smart applications enabled on it. Lots of research is being done by companies on novel sensing solutions and ubiquitous computing. Simple sensors that detect activities within residences will definitely thrive, and we have innovators like Shwetak Patel to thank for it. Similar innovations will help the machinery industry detect relevant activities based on a network of things.
The real change will begin when products are social and can seamlessly talk with networks, machines and people. So the social networks of tomorrow will evolve to be local networks of myriad things. It could be a fire alert automatically sending instructions to an assembly line within a manufacturing facility. Behavioral safety will take on new meaning, and it will surely make for a more productive place.
Why is this now possible? Inexpensive data and bandwidth compared to a decade ago, the availability of wireless networks, robust sensors, multiple mobile-enabled operating systems and the advent of cloud computing are enabling the change. The power to harness data and computing resources based on machine commands means that machines don't need human intuition or memory to operate. A cloud of things can be leveraged. Tomorrow's products will be able to easily leverage these inexpensive technologies and will provide mashups for other devices.
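As a small illustration, here is a sketch of a machine emitting telemetry that a cloud service could consume without human intervention. The machine ID, threshold and payload fields are illustrative assumptions, not a real vendor protocol.

```python
# A minimal sketch of machine telemetry with a simple service-needed flag.

import json
import time

VIBRATION_LIMIT_MM_S = 7.0   # hypothetical alert threshold

def build_telemetry(machine_id, temperature_c, vibration_mm_s, uptime_hours):
    """Package sensor readings into a JSON payload a cloud endpoint could ingest."""
    return json.dumps({
        "machine_id": machine_id,
        "timestamp": int(time.time()),
        "temperature_c": temperature_c,
        "vibration_mm_s": vibration_mm_s,
        "uptime_hours": uptime_hours,
        "needs_service": vibration_mm_s > VIBRATION_LIMIT_MM_S,
    })

payload = build_telemetry("press-line-3", temperature_c=68.5,
                          vibration_mm_s=8.2, uptime_hours=1412)
print(payload)   # a downstream rule could raise a maintenance alert from this flag
```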
Some examples of how traditional manufacturers are outsmarting their competitors via smart technologies:
Caterpillar has released an equipment preventative maintenance schedule interface that its dealers can use to upload data to its Equipment Care Advisor (ECA). Equipment Care Advisor combines equipment information and Cat dealer expertise to help customers make informed maintenance, repair or component replacement decisions. Another module, Equipment Condition Monitoring, allows collection of routines that facilitate early detection.
Creating a socially smart platform for your products
To leverage these advances, most things cannot be done alone. You need a social network of collective things, platforms to analyze data (such as Cat's ECA), and a mechanism to leverage big data in a relevant fashion.
Many smartphone apps have leveraged an existing connected product and an existing collaborative platform (Facebook, Foursquare, Twitter) to create a social network of connected things.
Garmin has created the Garmin Connect platform, where you can see a user's activity on a map, view lap splits and explore activities from other users. Plus, there is the ability to analyze data that is automatically uploaded via the cloud.
Social, smart and local is the new mantra for successful companies. Companies that create such products will outsmart their competitors. Monetization of such strategies will be easy, and customers will also get good value in terms of productivity and quality.
Finally, here is what Gartner has to say on smart machines:
"Through 2020, the smart machine era will blossom with a proliferation of contextually aware, intelligent personal assistants, smart advisers (such as IBM Watson), advanced global industrial systems and public availability of early examples of autonomous vehicles. The smart machine era will be the most disruptive in the history of IT. New systems that begin to fulfill some of the earliest visions for what information technologies might accomplish -- doing what we thought only people could do and machines could not --are now finally emerging."
Thanks for reading,
Virendra Bhattad
vbhattad · 12 years ago
ERP in the Cloud – Various options for organizations with onPremise ERP
As many IT services move to online offerings in the cloud, many IT executives are considering cloud adoption for their enterprise resource planning (ERP) systems as well.
In the consumer-centric environment, you can use applications in the cloud, such as Google's online documents or Google Drive, or watch movies from Netflix. Although some IT organizations have succeeded in moving a few ERP modules, such as CRM and HR, into the cloud, many CIOs have doubts about doing the same for core financial and supply chain operations.
In this article, I will address some of the factors that executives should consider when leveraging cloud-based services for their ERP systems. In fact, I have prepared a framework to arrive at more specific answers. In a nutshell, this framework considers the area of business, the complexity of integration within the organization, the existing ERP solution, security and company culture, to name a few. The framework also considers each aspect of the cloud computing service to be migrated, such as Infrastructure as a Service, Platform as a Service, Software as a Service and Integration as a Service.
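To show the flavour of such a framework, here is a minimal weighted-scoring sketch over the three options discussed below. The factors, weights and scores are hypothetical placeholders and would need to be replaced with your organization's own assessment.

```python
# A minimal sketch of a weighted scoring framework for cloud ERP options.

WEIGHTS = {
    "integration_complexity":  0.30,
    "security_requirements":   0.25,
    "total_cost_of_ownership": 0.25,
    "company_culture":         0.20,
}

# Scores from 1 (poor fit) to 5 (good fit) for each option, per factor (hypothetical).
OPTIONS = {
    "vendor_cloud_erp":        {"integration_complexity": 2, "security_requirements": 3,
                                "total_cost_of_ownership": 4, "company_culture": 3},
    "private_cloud_migration": {"integration_complexity": 4, "security_requirements": 5,
                                "total_cost_of_ownership": 2, "company_culture": 4},
    "new_cloud_erp":           {"integration_complexity": 3, "security_requirements": 3,
                                "total_cost_of_ownership": 4, "company_culture": 2},
}

def weighted_score(scores):
    return sum(WEIGHTS[factor] * score for factor, score in scores.items())

# Rank options from best to worst fit under these assumed weights.
for option, scores in sorted(OPTIONS.items(), key=lambda kv: -weighted_score(kv[1])):
    print(f"{option}: {weighted_score(scores):.2f}")
```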
I will discuss various options for organizations that are leveraging on-premise ERP solutions, as they relate to cloud adoption:
1) Existing vendor's own cloud-based ERP solution - For large organizations, many traditional ERP vendors are now offering cloud-based solutions. These are gaining some acceptance due to the large ownership costs associated with onPremise ERP, but they may not be hybrid or private cloud offerings and can compromise your organization's security standards. These solutions include SAP's Business ByDesign, Oracle's Fusion, Microsoft Dynamics running on Windows Azure and Infor's M3 offered as a public cloud offering on the AWS platform. The biggest drawbacks of these solutions are a lack of deep functionality and the unavailability of standard integration adapters for complex integration scenarios. Invariably, these offerings will also tie you into vendor lock-in. These solutions are typically geared towards small and medium enterprises.
It is also worthwhile to look into how a cloud-based offering was developed by the vendor in the first place. Some vendors have developed cloud-based solutions from scratch, while others have configured their existing ERP solutions to enable the cloud. For example, SAP's Business ByDesign, which was developed from scratch, covers an integrated suite with financials, HR, sales, procurement, customer service and supply chain management, while Microsoft Dynamics is adapting to Microsoft's own Windows Azure cloud offering. Infor is trying to leverage Amazon AWS IaaS for its own ERP solutions, typically for small-scale clients. It is worth noting that none of the existing vendors has been able to develop a new cloud offering with better market adoption than their traditional ERP solution.
2) Migration of existing onPremise ERP into a hybrid/private cloud - For large organizations, migration of existing onPremise ERP into the cloud (leveraging private cloud solutions) can be one of the better options, except for the costs associated with such an exercise. It can be cost beneficial, though, if a consortium of organizations can leverage such a solution, or if a vendor specializing in cloud offerings enables a consortium with a private cloud offering. For example, I see organizations specializing in data center operations, such as Telus, taking the lead and providing such offerings to their clients.
A phased migration approach to the private cloud can have many benefits, as it requires minimal business disruption in terms of existing processes and integrations. Some design-level changes may be imperative, as is consideration of the existing software's ability to be deployed into the cloud. Since this is a complex scenario where one size fits all does not apply, I would think that business buy-in is a must.
3) Migration of existing onPremise ERP into a new cloud-based ERP solution - For large organizations, migration of existing onPremise ERP into a totally new cloud-based solution can avoid all the headaches of a traditional migration. It can also be a cost-saving exercise in the long run, provided the new cloud solution covers all essential functions and existing staff can be retrained quickly to use the new ERP. Vendors such as NetSuite are leading the charge for small and medium-scale organizations. At this time, the majority of cloud ERP installations are in smaller companies because they have basic functional requirements, typically don't have a large investment in IT infrastructure, and have relatively few users. This is changing as acceptance moves up-market and larger companies implement cloud ERP solutions.
Cloud vendors are able to implement their solutions faster than on-premises vendors because their solutions are simpler; they don't usually offer the sophistication or flexibility that on-premises vendors do. These cloud vendors may not be ready to handle complex integration scenarios, but over time their solutions can become mature enough for some industries.
What's the answer?
Is migration of ERP to the cloud right for your organization? This depends on your company culture, resources, requirements, IT infrastructure, IT integration and the total cost of ownership at your organization. Cloud-based technology solutions require companies to loosen their control of critical data, so companies must take a comprehensive approach to the risks, from both the business and the IT security perspectives.
In my software consulting experience, I've rarely come across a situation where one size fits all. But I hope this article helps you or your organization delve into the various options and arrive at the right roadmap.
vbhattad · 12 years ago
Integration Architecture for tomorrow's enterprises
Welcome to the world of Google, Twitter, Facebook and LinkedIn. You must also have heard the buzzwords personalization, gamification, SOA, simplicity and security. Suddenly, integration architecture comes to the forefront of business transformation. Having looked at large enterprises and how their integration efforts have evolved over the last 10 years or so, I can foresee integration moving towards service-oriented and cloud-based architecture. Integration architecture acquires a new vision and discipline.
Enterprise architects are proposing lighter-weight infrastructure to support more complex integration projects. New services are being influenced by an API landscape that is increasingly REST-based, service-oriented and mobile-enabled. Here are the most important integration principles that should be adopted for new transformations:
Simplicity in ESB designs
An enterprise service bus (ESB) is a key enabler for service-oriented architecture (SOA). ESBs have increased in popularity over the years, along with new modules developed by vendors for various problems. There are now multiple solutions for business optimization, data governance, business process management, service-oriented architecture, orchestration and cloud enablement. But many times ESB tools are implemented to solve the wrong problems, or in a flawed manner. For example, when an ESB starts processing application logic, it gets a bad name; the functionality of an application server will never be replicated well by an ESB solution or infrastructure. The crux is that special attention should be paid to keeping the solution simple and leveraging the modules that solve the underlying problems. This will also ensure that your ESB passes the relevant benchmarks for performance and security.
Appropriate use of ESB infrastructure
The middle tier (the ESB) provides routing and mediation of messages passed between service requesters (consumers) and service providers. An ESB provides a way to decouple the requester's view of a service from the implementation of that service, and it also decouples the business view of a solution from the technical service interactions. So the most appropriate use of an ESB is to align long-term strategic goals with the underlying implementation. The best use-case scenarios for an ESB are therefore message routing, master data management, translation and achieving decoupling. Implementing the appropriate modules to solve the underlying problems is crucial to using the tool correctly. Otherwise, enterprises end up with an ESB infrastructure that is costly and difficult to maintain. I correlate this with the tremendous IT waste I have seen due to a lack of foresight or vision in selection, design and overall principles.
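To ground the message-routing use case, here is a minimal sketch of content-based routing, one of the classic ESB responsibilities. The message types and channel names are hypothetical; a real ESB would express these rules as configuration rather than code.

```python
# A minimal sketch of content-based message routing.

def route(message):
    """Inspect the message content and pick a destination channel."""
    doc_type = message.get("type")
    if doc_type == "purchase_order":
        return "channel.orders"
    if doc_type == "invoice":
        return "channel.finance"
    if doc_type == "customer_master":
        return "channel.mdm"        # master data management flow
    return "channel.dead_letter"    # unknown messages are parked, not dropped

for msg in [{"type": "purchase_order", "id": 1},
            {"type": "customer_master", "id": 2},
            {"type": "unknown", "id": 3}]:
    print(msg["id"], "->", route(msg))
```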
Governance for integration architecture
Pay special attention to finalizing governance in a global organization. REST patterns have been found to be easier to understand and provide an underlying governance model that is developer-friendly; REST has been adopted by web giants like Google, Facebook, Twitter and LinkedIn. An effort must be made to provide proper governance for your integration architecture efforts. If you are working in a B2B environment, your overall API pattern must align with the availability of protocols, such as SOAP or REST, from your vendors.
Gamification for enterprise integration efforts
Games and game technologies increasingly extend their traditional boundaries, as evidenced by the growth of serious and pervasive games. As the success of location-based services such as Foursquare shows, this design approach has rapidly gained traction in interaction design and digital marketing. So today, some enterprise application experts are also adopting gamification for the integration of technologies such as Big Data and the cloud. Development shops are creating game concepts to keep the development community engaged and overcome major hurdles to integration architecture success.
vbhattad · 12 years ago
Big Data Revolution: How to change your servicing model
We are witnessing a huge shift in how we connect, socialize, communicate and manage customers, and this has to be taken into cognizance with the advent of Big Data. Users have access to data on the go via various devices and platforms, and each platform boasts a user base that can bring fundamental changes to many landscapes and policies. We have had major political paradigm shifts in the Middle East, and we have witnessed global protest movements across the world, all aided by the availability of data and the advent of Big Data. This forces us to change IT architectures at no small scale to reap the benefits of Big Data. Any strategy should obviously consider each organization's mission statement, which should then be aligned with the business and architecture vision. In my day-to-day work, people are always asking me about the guiding principles that can bring about the required change for advances in Big Data.
Here are various pillars for guiding your architecture and product management for the new world:
Personalization - Classifying and retrieving data relevant to each user, and keeping relevant data when the user switches platforms. Providing personalization will wow the user experience for your business. Users would like simplicity, yet powerful technology behind the business driving the servicing model.
Responsive Designs - Providing a consistent experience when users have performed a set of functions at any location, device, system or platform. Imagine a user who has initiated a purchase transaction on mobile and wants to fulfill this order at home; he should get a responsive and consistent URL and experience.
Social and Live Feedback - To understand your customers better, provide them with tools to share and submit feedback socially. Be open to providing your own reviews and opinions as well, specifically attributed via your signature. Open communication will lead customers to embrace and share your brand and their own experience. This will enable a better 360-degree review of your offerings to drive the future roadmap for your product development.
Real-Time Data Processing - Take advantage of complete data processing systems. Providing various layers for such complex data processing via the batch layer, the serving layer and the speed layer will help move from batch-based data processing to real-time data processing, especially with Hadoop as a framework.
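Here is a minimal sketch of that batch/serving/speed layering (the lambda-architecture style). In practice the batch view would be recomputed on a framework such as Hadoop and the speed view fed by a streaming pipeline; the in-memory dictionaries and page-view counts here are illustrative stand-ins.

```python
# A minimal sketch of merging batch and speed views at query time.

# Batch layer: precomputed view over all historical events (recomputed periodically).
batch_view = {"user-1": 120, "user-2": 45}    # e.g. page views per user

# Speed layer: incremental view over events that arrived since the last batch run.
speed_view = {"user-1": 3, "user-3": 7}

def serving_layer(user_id):
    """Merge the batch and speed views to answer queries in near real time."""
    return batch_view.get(user_id, 0) + speed_view.get(user_id, 0)

for user in ("user-1", "user-2", "user-3"):
    print(user, serving_layer(user))
```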