#aws lambda eventbridge
codeonedigest · 1 year
AWS Lambda Compute Service Tutorial for Amazon Cloud Developers
Full video link: https://youtube.com/shorts/QmQOWR_aiNI. A new video tutorial on AWS Lambda has been published on the CodeOneDigest YouTube channel.
AWS Lambda is a serverless compute service that runs your code in response to events and automatically manages the underlying compute resources for you. These events may include changes in state such as a user placing an item in a shopping cart on an ecommerce website. AWS Lambda automatically runs code in response to multiple events, such as HTTP requests via Amazon API Gateway, modifications…
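To make the event-driven model concrete, here is a minimal Python handler sketch for the API Gateway case. The response shape follows the standard proxy integration; the shopping-cart logic is purely illustrative.

import json

def lambda_handler(event, context):
    # API Gateway proxy integrations deliver the HTTP body as a JSON string.
    body = json.loads(event.get("body") or "{}")
    item = body.get("item", "unknown")
    # Return a proxy-integration response object.
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"Added {item} to cart"}),
    }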
markwatsonsbooks · 3 months
AWS Ultimate Guide: From Beginners to Advanced by SK Singh
This is a very comprehensive book on AWS, taking readers from beginner to advanced. The book has extensive diagrams that make the topics much easier to understand.
To make understanding the subject a smoother experience, the book is divided into the following sections:
Cloud Computing
AWS Fundamentals (What is AWS, AWS Account, AWS Free Tier, AWS Cost & Billing Management, AWS Global Cloud Infrastructure (part I), IAM, EC2)
AWS Advanced (EC2 Advanced, ELB, Advanced S3, Route 53, AWS Global Cloud Infrastructure (part II), Advanced Storage on AWS, AWS Monitoring, Audit, and Performance)
AWS RDS and Databases (AWS RDS and Cache, AWS Databases)
Serverless (Serverless Computing, AWS Integration, and Messaging)
Container & CI/CD (Container, AWS CI/CD services)
Data & Analytics (Data & Analytics)
Machine Learning (AWS ML/AI Services)
Security (AWS Security & Encryption, AWS Shared Responsibility Model, How to get Support on AWS, Advanced Identity)
Networking (AWS Networking)
Disaster Management (Backup, Recovery & Migrations)
Solutions Architecture (Cloud Architecture Key Design Principles, AWS Well-Architected Framework, Classic Solutions Architecture)
Includes AWS services/features such as IAM, S3, EC2, EC2 purchasing options, EC2 placement groups, Load Balancers, Auto Scaling, S3 Glacier, S3 Storage classes, Route 53 Routing policies, CloudFront, Global Accelerator, EFS, EBS, Instance Store, AWS Snow Family, AWS Storage Gateway, AWS Transfer Family, Amazon CloudWatch, EventBridge, CloudWatch Insights, AWS CloudTrail, AWS Config, Amazon RDS, Amazon Aurora, Amazon ElastiCache, Amazon DocumentDB, Amazon Keyspaces, Amazon Quantum Ledger Database, Amazon Timestream, Amazon Managed Blockchain, AWS Lambda, Amazon DynamoDB, Amazon API Gateway, SQS, SNS, SES, Amazon Kinesis, Amazon Kinesis Data Firehose, Amazon Kinesis Data Analytics, Amazon Kinesis Data Streams, Amazon ECS, Amazon ECR, Amazon EKS, AWS CloudFormation, AWS Elastic Beanstalk, AWS CodeBuild, AWS OpsWorks, AWS CodeGuru, AWS CodeCommit, Amazon Athena, Amazon Redshift, Amazon EMR, Amazon QuickSight, AWS Glue, AWS Lake Formation, Amazon MSK, Amazon Rekognition, Amazon Transcribe, Amazon Polly, Amazon Translate, Amazon Lex, Amazon Connect, Amazon Comprehend, Amazon Comprehend Medical, Amazon SageMaker, Amazon Forecast, Amazon Kendra, Amazon Personalize, Amazon Textract, Amazon Fraud Detector, Amazon Sumerian, AWS WAF, AWS Shield Standard, AWS Shield Advanced, AWS Firewall Manager, Amazon GuardDuty, Amazon Inspector, Amazon Macie, Amazon Detective, SSM Session Manager, AWS Systems Manager, S3 Replication & Encryption, AWS Organizations, AWS Control Tower, AWS SSO, Amazon Cognito, AWS VPC, NAT Gateway, VPC Endpoints, VPC Peering, AWS Transit Gateway, AWS Site-to-Site VPN, AWS Database Migration Service (DMS), and many others.
Order YOUR Copy NOW: https://amzn.to/4bfoHQy via @amazon
sophiamerlin · 11 months
Preparing for Success: A Comprehensive Approach to the AWS Certified Developer Associate Exam
The AWS Certified Developer Associate Exam is a crucial step in establishing your expertise in developing applications on the AWS platform.
If you want to advance your career, take a systematic approach and sign up for a course (such as an AWS course in Pune) that best suits your interests and will greatly expand your learning path.
This post aims to provide you with a comprehensive guide on how to pass the exam successfully.
Some Steps to Increase Your Chances of Achieving the Certification:
1. Understand the Exam Blueprint: To start your preparation, familiarize yourself with the official AWS Certified Developer Associate Exam Guide. This document outlines the exam domains, subtopics, and their weightage. It serves as a roadmap for your study plan, enabling you to focus on the key areas that will be assessed.
2. Gain Hands-on Experience: Practical experience is paramount for success in the exam. Create an AWS Free Tier account and immerse yourself in working with various AWS services. Build and deploy applications using services such as EC2, S3, Lambda, DynamoDB, and API Gateway. The more hands-on experience you gain, the better you'll understand the services and their integration.
3. Study Relevant AWS Services: The exam will test your knowledge of various AWS services. Develop a strong understanding of their features, use cases, and best practices. Some of the key services to focus on include EC2, ECS, Lambda, S3, EBS, RDS, DynamoDB, VPC, Route 53, CloudFront, SNS, SQS, Step Functions, EventBridge, CloudFormation, and the AWS SDKs. Dive into the AWS documentation, whitepapers, and FAQs to deepen your knowledge.
4. Review AWS Identity and Access Management (IAM): IAM is a critical aspect of AWS security and access management. Understand IAM roles, policies, users, groups, and permissions. Learn how to grant appropriate access to AWS resources while adhering to the principle of least privilege.
5. Familiarize Yourself with AWS SDKs and Developer Tools: Gain knowledge of the AWS SDKs available for various programming languages. Understand how to use them to interact with AWS services programmatically (see the boto3 sketch after this list). Additionally, explore developer tools like the AWS CLI, AWS SAM, and CloudFormation for infrastructure-as-code deployment.
6. Practice with Sample Questions and Practice Exams: Utilize official AWS sample questions and practice exams to assess your knowledge and identify areas that require further study. This exercise will also help you become familiar with the exam format and improve time management skills.
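As referenced in step 5, here is a minimal boto3 sketch of interacting with AWS services programmatically, assuming credentials are already configured; the function name "my-function" is a hypothetical placeholder.

import boto3

# List all S3 buckets in the account.
s3 = boto3.client("s3")
for bucket in s3.list_buckets()["Buckets"]:
    print(bucket["Name"])

# Invoke a Lambda function synchronously with a small JSON payload.
lambda_client = boto3.client("lambda")
response = lambda_client.invoke(
    FunctionName="my-function",  # hypothetical function name
    Payload=b'{"action": "ping"}',
)
print(response["StatusCode"])

The AWS CLI exposes the same operations (for example, aws s3 ls and aws lambda invoke), which is handy for quick checks while you practice.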
If you want to learn more about AWS certification online, contact a good training institute, as they offer certification preparation and job placement opportunities. Experienced teachers can help you learn better. You can find these services both online and offline. Take things step by step, and consider enrolling in a course if you're interested.
7. Explore the AWS Well-Architected Framework: The AWS Well-Architected Framework provides architectural best practices for building reliable, secure, efficient, and cost-effective AWS applications. Understand the five pillars of the framework (Operational Excellence, Security, Reliability, Performance Efficiency, and Cost Optimization) and how they apply to AWS services.
8. Enroll in Online Courses and Training: Consider enrolling in online courses specifically designed for the AWS Certified Developer Associate Exam. Platforms like Udemy, A Cloud Guru, and Linux Academy offer comprehensive courses taught by experienced instructors. These courses can provide structured learning and help you grasp complex concepts effectively.
9. Join Study Groups and Discussion Forums: Connect with fellow exam takers and AWS professionals in study groups or online forums. Engaging in discussions, sharing resources, and learning from each other's experiences can significantly enhance your understanding and provide valuable insights.
10. Review Exam Readiness Resources: Before the exam, review the official exam readiness resources provided by AWS. These resources include exam guides, whitepapers, and FAQs related to the exam topics. Ensure you are aware of any updates or changes to the exam content.
Passing the AWS Certified Developer Associate Exam requires a combination of theoretical knowledge and practical skills. By following this comprehensive guide, dedicating sufficient time for preparation, and staying updated with the latest AWS services and best practices, you can increase your chances of success.
Remember to practice regularly, seek support from study groups, and leverage the available resources.
Good luck on your journey to becoming an AWS Certified Developer Associate!
metamoonshots · 11 months
Web3 gaming platform Immutable recently announced a partnership with Amazon Web Services (AWS). This collaboration is set to revolutionize the world of blockchain gaming, ushering in a new era of possibilities for game developers in the crypto space.
Immutable Joins AWS's ISV Accelerate Program
According to an Oct. 10 statement, Immutable has recently joined AWS's ISV Accelerate Program, a sales program designed for companies leveraging AWS services in their products. Immutable's blockchain gaming technology, particularly Immutable X, which is Ethereum-compatible, will now be seamlessly integrated with AWS's Activate program.
#Immutable 🤝 @amazon Amazon Web Services and Immutable are working together to shape the future of gaming! Through our collaboration with Amazon, we will gain access to a vast pipeline of game studio leads, support for successful deal closures, and up to $100k in AWS cloud… pic.twitter.com/SX7xfFqrtK — Immutable (@Immutable) October 10, 2023
Immutable's Chief Commercial Officer, Jason Suen, expressed excitement about the partnership, stating, "By joining the AWS ISV Accelerate and AWS Activate programs, we are able to provide our vast network of game developers with a turnkey solution for quickly building and scaling web3 games."
AWS Head of Startups, John Kearney, highlighted the significance of the gaming sector within the blockchain industry. He noted that numerous studios are exploring the integration of crypto technology into their games and expressed anticipation for the future of this dynamic field. AWS is also actively engaged with the Blockchain Game Alliance, showcasing its dedication to advancing blockchain gaming. AWS, known for its cloud computing platforms and data storage services, offers various services tailored to the gaming sector. These include cloud gaming services, game servers, game security services, game analytics, video game artificial intelligence, and machine learning offerings.
AWS's Role in Immutable's Growth
Immutable's collaboration with AWS will open up a vast pipeline of game studio leads and provide significant support for deal closures. This partnership offers the potential to simplify the growth and expansion of blockchain games by providing access to AWS resources, which can enhance confidence for potential clients and, ultimately, facilitate the closing of agreements with prominent game studios globally. Immutable's platform is built on Amazon EventBridge and AWS Lambda, harnessing serverless services that use events to connect application components. This architecture empowers the platform to scale effectively, handling a 10x increase in partnered games. In addition to its collaboration with AWS, developers seeking to build on Immutable's blockchain can also make use of AWS Activate, a program offering significant perks such as technical support, training, and a substantial $100,000 worth of AWS cloud credits.
Meanwhile, Immutable has been making strides with its zero-knowledge Ethereum Virtual Machine (zkEVM), which underwent public testing in collaboration with Polygon Labs in August. The zkEVM promises to lower development costs for game developers while providing the security and network effects associated with the Ethereum ecosystem.
Immutable has also gained recognition for its valuation, reaching $2.5 billion in March 2022, following an outstanding $200 million Series C funding round led primarily by Singaporean state-owned investment firm Temasek.
ailtrahq · 1 year
Leading Web3 gaming platform Immutable has announced a partnership with Amazon Web Services (AWS) in a bid to boost the Web3 gaming ecosystem. According to the blockchain gaming firm, the partnership would allow it to access a pipeline of game studio deals and support for successful deal closures.
Immutable AWS Partnership
Immutable announced the news about the partnership on X (formerly Twitter). The announcement stated that both Immutable and Amazon Web Services are working together to shape the future of gaming.
"Amazon Web Services and Immutable are working together to shape the future of gaming! Through our collaboration with Amazon, we will gain access to a vast pipeline of game studio leads, support for successful deal closures, and up to $100k in AWS cloud credits per Immutable customer. When data from the cloud is validated on Ethereum, a viable model for blockchain gaming gets realized. Immutable has joined Amazon's ISV Accelerate Program! This co-sell program allows us to access expert resources from AWS to help secure prospective customers and ultimately close deals with major game studios worldwide."
According to Immutable's blog post, published on the 10th of October, Amazon Web Services had added it to the list of companies within its ISV (independent software vendor) Accelerate Program. The program allows companies to provide software solutions that can integrate with or run on Amazon Web Services. Furthermore, developers looking to build on the Immutable blockchain can join AWS Activate. AWS Activate provides developers with perks such as technical support, training, and $100,000 worth of Amazon cloud credits.
The partnership will provide a significant boost to Web3 gaming. Web3 gaming is a fast-growing trend that uses blockchain technology to give players true ownership of in-game assets. Players can then trade these assets with other players. Nearly 100 million gamers are estimated to take to Web3 gaming over the next couple of years.
AWS Committed To Expanding Web3 Gaming
John Kearney, the AWS head of Startups Australia, stated that Immutable is an excellent example of a local Australian startup that has gone global. Kearney also reaffirmed Amazon Web Services' commitment to helping expand Web3 game development using its existing infrastructure. Immutable has been built using Amazon EventBridge and AWS Lambda. These are serverless services that use events to connect application components. This has allowed the platform to significantly boost scalability to handle a 10x increase in partnered games. The blog post stated:
"Built with Amazon EventBridge, a serverless service that uses events to connect application components together, and AWS Lambda, a serverless compute service, Immutable's serverless architecture has allowed the platform to scale effectively to support its rapidly expanding product suite. Through these services, Immutable has increased its scalability to handle a 10x increase in partnered games and improved reliability to enhance the customer experience through increased security and over 99 percent uptime."
Centralization Concerns About Gaming On Ethereum
There have been several concerns about the centralization of gaming and Ethereum, along with the over-reliance on Amazon. Amazon is a market leader and has captured around a third of the cloud services market.
According to a report published in 2022, a majority of active Ethereum nodes were running through centralized web providers such as Amazon's cloud services. However, Michael Powell, the Immutable product marketing lead, put to rest some of the concerns, stating, "A lot of blockchain purists are very big into the idea of decentralization and that everything has to be on-chain, and that's a massive deviation from where game developers actually build."
Immutable-Polygon Collaboration
Immutable also began the public testing of its zero-knowledge Ethereum Virtual Machine (zkEVM) in collaboration with Polygon Labs.
According to Immutable, the zero-knowledge Ethereum Virtual Machine will help lower developmental costs for game developers while ensuring security and providing the network effects that come with the larger Ethereum ecosystem.  Immutable was valued at $2.5 billion in March 2022 after a successful $200 million Series C funding round. Temasek, a Singaporean state-owned investment firm, led the round.
pumpp89 · 1 year
Your Trusted Cloud Optimization Planning Partner in the USA
In today's fast-paced digital landscape, harnessing the potential of cloud computing is essential for businesses seeking to stay competitive and agile. Amazon Web Services (AWS) has emerged as a leader in the cloud industry, offering a wide array of services that empower organizations to scale, innovate, and streamline their operations. To help businesses make the most of AWS and its cloud event services, a reliable partner is essential. In the USA, one such partner stands out as a beacon of expertise and innovation: the Free AWS Cloud Event Services USA and Cloud Optimization Planning Company.
Unveiling the AWS Cloud Event Services Advantage
AWS Cloud Event Services form the backbone of modern cloud-based applications and infrastructure. They enable businesses to seamlessly integrate, monitor, and respond to events across their AWS environments. These services facilitate real-time data processing, event-driven architectures, and automation, paving the way for unparalleled scalability, resilience, and cost efficiency. Key AWS Cloud Event Services include AWS EventBridge, Amazon CloudWatch Events, and AWS Step Functions.
1. AWS EventBridge: AWS EventBridge provides a serverless event bus that connects applications using events. It simplifies event-driven architecture by routing events from various sources to targets like AWS Lambda functions or AWS SNS topics. This enables businesses to build responsive and decoupled systems.
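As a minimal sketch of publishing a custom event with boto3 (assuming credentials are configured; the bus name, source, and detail fields below are hypothetical examples):

import json
import boto3

events = boto3.client("events")
response = events.put_events(
    Entries=[{
        "EventBusName": "orders-bus",        # hypothetical custom bus
        "Source": "com.example.orders",
        "DetailType": "OrderPlaced",
        "Detail": json.dumps({"orderId": "1234", "total": 42.5}),
    }]
)
# FailedEntryCount should be 0 when the event was accepted.
print(response["FailedEntryCount"])

A rule on the bus can then route OrderPlaced events to a Lambda function or an SNS topic, keeping producers and consumers fully decoupled.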
2. Amazon CloudWatch Events: Amazon CloudWatch Events makes it easy to respond to changes in your AWS resources. It can trigger automated responses such as scaling, remediation, or notifications based on predefined rules. This service enhances operational efficiency and system reliability.
3. AWS Step Functions: AWS Step Functions is a serverless orchestration service that allows businesses to coordinate distributed applications and microservices. It makes it simple to build and run applications that respond to various events and workflows.
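To illustrate, here is a minimal Step Functions sketch using boto3: it registers a trivial two-state workflow written in Amazon States Language and starts one execution. The workflow name and the IAM role ARN are placeholders that would need to exist in your account.

import json
import boto3

sfn = boto3.client("stepfunctions")

# A trivial two-state workflow definition (Amazon States Language).
definition = {
    "StartAt": "Validate",
    "States": {
        "Validate": {"Type": "Pass", "Next": "Done"},
        "Done": {"Type": "Succeed"},
    },
}

machine = sfn.create_state_machine(
    name="demo-workflow",  # hypothetical name
    definition=json.dumps(definition),
    roleArn="arn:aws:iam::123456789012:role/sfn-role",  # placeholder role
)

# Start a single execution with a small JSON input.
sfn.start_execution(
    stateMachineArn=machine["stateMachineArn"],
    input=json.dumps({"orderId": "1234"}),
)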
The Free AWS Cloud Event Services and Cloud Optimization Planning Company Advantage
For businesses in the USA, optimizing AWS Cloud Event Services and achieving peak performance across the AWS ecosystem can be a daunting task. This is where the Free AWS Cloud Event Services and Cloud Optimization Planning Company comes into play. Here's why they are your ideal partner:
1. Expertise in AWS Cloud Event Services: With a team of certified AWS professionals, this company possesses in-depth knowledge and hands-on experience with AWS Cloud Event Services. They can design, implement, and manage event-driven architectures tailored to your business needs.
2. Comprehensive Cloud Optimization Planning: The company offers holistic cloud optimization planning services, ensuring that your AWS infrastructure is cost-efficient, secure, and high-performing. They conduct thorough assessments and provide actionable recommendations.
3. Proactive Monitoring and Management: Continuous monitoring and management of AWS Cloud Event Services ensure that your systems are always responsive and efficient. They proactively identify and address issues, minimizing downtime and maximizing ROI.
4. Scalability and Flexibility: As your business grows, the company can scale AWS resources and event-driven solutions to accommodate changing demands. This scalability ensures that your cloud infrastructure keeps pace with your evolving requirements.
5. Cost Optimization: The company specializes in cost optimization, helping you avoid overspending on AWS services. By analyzing your usage patterns and implementing cost-saving strategies, they ensure you get the most value from your AWS investments.
In conclusion, AWS Cloud Event Services offer a powerful framework for modern businesses to build dynamic, event-driven applications. To unlock their full potential, partnering with a dedicated AWS Cloud Event Services and Cloud Optimization Planning Company in the USA is paramount. With their expertise, you can harness the capabilities of AWS Cloud Event Services while optimizing your cloud environment for efficiency, scalability, and cost-effectiveness. Embrace the future of cloud computing with the right partner by your side.
axoloth · 1 year
Amazon Web Services (AWS)
1. Compute Services:
   - Amazon Elastic Compute Cloud (EC2)
   - Amazon Elastic Container Service (ECS)
   - AWS Lambda
   - AWS Elastic Beanstalk
   - AWS Batch
   - Amazon Lightsail
   - AWS Fargate
2. Storage and Content Delivery Services:
   - Amazon Simple Storage Service (S3)
   - Amazon Elastic Block Store (EBS)
   - Amazon Elastic File System (EFS)
   - Amazon Glacier
   - AWS Storage Gateway
   - Amazon CloudFront
   - Amazon Snowball
3. Database Services:
   - Amazon Relational Database Service (RDS)
   - Amazon DynamoDB
   - Amazon Redshift
   - Amazon ElastiCache
   - Amazon Neptune
   - Amazon DocumentDB
   - Amazon Quantum Ledger Database (QLDB)
4. Networking and Content Delivery Services:
   - Amazon Virtual Private Cloud (VPC)
   - Elastic Load Balancing (ELB)
   - AWS Direct Connect
   - Amazon Route 53
   - Amazon API Gateway
   - AWS Global Accelerator
5. Security, Identity, and Compliance Services:
   - AWS Identity and Access Management (IAM)
   - AWS Key Management Service (KMS)
   - AWS Secrets Manager
   - AWS Shield
   - AWS WAF (Web Application Firewall)
   - Amazon Cognito
   - AWS Certificate Manager (ACM)
6. Management and Governance Services:
   - AWS CloudFormation
   - AWS CloudTrail
   - AWS Systems Manager
   - Amazon CloudWatch
   - AWS Auto Scaling
   - AWS Trusted Advisor
   - AWS Config
7. Analytics Services:
   - Amazon Athena
   - Amazon Kinesis
   - Amazon Redshift Spectrum
   - Amazon QuickSight
   - AWS Glue
   - AWS Data Pipeline
8. AI and Machine Learning Services:
   - Amazon Polly
   - Amazon Lex
   - Amazon Rekognition
   - Amazon SageMaker
   - Amazon Transcribe
   - Amazon Comprehend
   - AWS DeepLens
9. Application Integration Services:
   - Amazon Simple Queue Service (SQS)
   - Amazon Simple Notification Service (SNS)
   - Amazon Simple Workflow Service (SWF)
   - AWS Step Functions
   - Amazon EventBridge
10. Mobile Services:
    - AWS Mobile Hub
    - AWS Device Farm
    - AWS Mobile Analytics
    - AWS Pinpoint
11. Developer Tools:
    - AWS CodeStar
    - AWS CodeCommit
    - AWS CodeBuild
    - AWS CodePipeline
    - AWS CodeDeploy
    - AWS X-Ray
12. Internet of Things (IoT):
    - AWS IoT Core
    - AWS IoT Analytics
    - AWS IoT Device Management
    - AWS IoT Events
    - AWS IoT Greengrass
13. Blockchain Services:
    - Amazon Managed Blockchain
14. Game Development:
    - Amazon GameLift
ALEXA SKILL IMPLEMENTATION FOR A GAZETTE
Executive Summary
We were approached by a long-established newspaper publisher in California to develop an Alexa skill for their weekly newspaper. The project entailed converting their WordPress website into a highly interactive news interface with an Alexa skill, allowing users to explore, search, and listen to the latest news from the newspaper using their voice on multimodal devices such as the Echo Show family, Fire TV, and Echo Dot, with interactive APL screen designs. The skill provides a convenient way for visitors to access breaking news, multimedia, and archives, increasing engagement and retention and reaching a new audience of voice-enabled device users. Our ability to deliver this innovative and effective solution demonstrates our expertise in meeting the specific needs of publishers.
About our Client
Client : Confidential
Location: USA
Industry: Media & Entertainment
Technologies
Python, Alexa Skill Kit, Alexa Presentation Language (APL), AWS Lambda, DynamoDB, Polly, S3, CloudWatch, EventBridge, Revive Ad Server
Download Full Case Study
udauda · 2 years
SAA #151
Rotate the access keys of all IAM users every 90 days:
→ Create an AWS Config rule to check whether keys have expired → Define an Amazon EventBridge (Amazon CloudWatch Events) rule to schedule a Lambda function that deletes the expired keys.
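A minimal Python sketch of the Lambda side of this pattern (the 90-day threshold comes from the note above; pagination of access keys and error handling are omitted for brevity):

import datetime
import boto3

iam = boto3.client("iam")
MAX_AGE = datetime.timedelta(days=90)

def lambda_handler(event, context):
    now = datetime.datetime.now(datetime.timezone.utc)
    # Walk every IAM user and delete access keys older than 90 days.
    for page in iam.get_paginator("list_users").paginate():
        for user in page["Users"]:
            keys = iam.list_access_keys(UserName=user["UserName"])
            for key in keys["AccessKeyMetadata"]:
                if now - key["CreateDate"] > MAX_AGE:
                    iam.delete_access_key(
                        UserName=user["UserName"],
                        AccessKeyId=key["AccessKeyId"],
                    )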
cloudemind · 4 years
11 great AWS serverless services to use in your cloud architecture
The latest AWS study article is available at https://cloudemind.com/aws-serverless-services/ - Cloudemind.com
Serverless services are fully managed services: everything related to the underlying hardware, provisioning, maintenance, and patching, and even higher-level concerns such as security of usage, is handled for you by AWS. Serverless can be considered ideal for developers: no more worrying about how many servers to provision, whether capacity is over- or under-allocated, or whether anything needs updating, upgrading, or patching after going live (a real nightmare for developers who aren't familiar with infrastructure, the CLI, or shells).
AWS understands this and offers a number of services of this serverless kind.
What is serverless?
In Vietnamese, "serverless" would translate literally as "phi máy chủ" ("without servers"), but that rendering sounds a bit awkward, and it is hard to translate the meaning faithfully. Serverless does not mean there are literally no servers; underneath, there are still servers, but AWS makes this layer transparent (invisible) and automatically manages and operates it, as long as it provides the capacity your design calls for.
AWS Serverless
Definition of serverless computing: Serverless computing is a cloud computing execution model in which the cloud provider allocates machine resources on demand, taking care of the servers on behalf of their customers – Wikipedia
Serverless refers to applications where the management and allocation of servers and resources are completely managed by the cloud provider. – Serverless-Stack
Serverless is a cloud-native development model that allows developers to build and run applications without having to manage servers. There are still servers in serverless, but they are abstracted away from app development. – Redhat
Kevin focuses on developing products using cloud-native services, so he always prioritizes serverless services in his architectures to speed up development, scale easily, and keep costs much lower than the traditional cloud approach.
Now, let's go through the AWS serverless services currently available:
1. AWS Lambda
Type: Compute Services
Description: Run code without having to care about servers; supports coding in popular languages such as Python, Node.js, Go, and Java. Even better, since 2020 Lambda has expanded its capacity to 6 vCPUs and 10 GB of RAM and supports running Docker container images.
Pricing Model:
Number of requests
Duration of execution
Reference: https://aws.amazon.com/lambda/
2. Amazon API Gateway
Type: API, proxy
Description: Helps you deploy APIs at large scale; supports RESTful and WebSocket APIs.
Pricing Model:
Number of requests
Caching
Reference: https://aws.amazon.com/api-gateway/
3. Amazon DynamoDB
Type: NoSQL DB
Description: AWS's NoSQL database service, supporting key-value pairs and document DB. This is a database with very fast access times, measured in single-digit milliseconds; combined with DAX caching it drops to microseconds, and it can scale to 20 million requests per second.
Pricing Model (on-demand and provisioned):
Write Capacity Unit
Read Capacity Unit
Storage
Data Transfer
etc
Reference: https://aws.amazon.com/dynamodb/
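A minimal boto3 sketch of writing and reading one DynamoDB item ("blog-posts" is a hypothetical table whose partition key is a string attribute named "pk"):

import boto3

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("blog-posts")  # hypothetical table

# Write a single item, then read it back by its key.
table.put_item(Item={"pk": "post#1", "title": "Hello DynamoDB"})
item = table.get_item(Key={"pk": "post#1"}).get("Item")
print(item)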
4. Amazon EventBridge
Type: Controller
Description: Amazon EventBridge can be seen as an event bus, a central place for events from many kinds of SaaS applications and AWS services. EventBridge collects events from applications such as Zendesk, Datadog, and PagerDuty and routes this data to AWS Lambda. You can also set up rules to route the data to different applications. EventBridge helps you build event-driven applications. The EventBridge schema registry supports Python, TypeScript, and Java, making development more convenient.
Pricing Model:
Pay for events to your event bus
Events ingested to Schema Discovery
Event Replay
Reference: https://aws.amazon.com/eventbridge/
5. Amazon SNS (Simple Notification Service)
Type: Messaging
Description: A pub/sub messaging service that supports SMS, email, and mobile push notifications.
Pricing Model:
Number of requests
Notification deliveries
Data Transfer
Reference: https://aws.amazon.com/sns/
6. Amazon SQS (Simple Queue Service)
Type: Messaging, Queuing
Description: A message queue service for building queues that decouple groups of services, and a way to make applications deployed on the cloud more reliable. SQS supports standard queues to maximize throughput and FIFO queues to guarantee that messages are delivered exactly once, in the order they were sent.
Pricing Model:
Number of requests
Data Transfer
Reference: https://aws.amazon.com/sqs/
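A minimal boto3 sketch of the send/receive/delete cycle ("orders-queue" is a hypothetical queue):

import boto3

sqs = boto3.client("sqs")
queue_url = sqs.get_queue_url(QueueName="orders-queue")["QueueUrl"]

# Producer side: enqueue a message.
sqs.send_message(QueueUrl=queue_url, MessageBody='{"orderId": "1234"}')

# Consumer side: long-poll, process, then delete each message explicitly.
resp = sqs.receive_message(QueueUrl=queue_url, MaxNumberOfMessages=1,
                           WaitTimeSeconds=10)
for msg in resp.get("Messages", []):
    print(msg["Body"])
    sqs.delete_message(QueueUrl=queue_url, ReceiptHandle=msg["ReceiptHandle"])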
7. Amazon S3 (Simple Storage Service)
Type: Storage
Description: An object storage service. S3 provides unlimited storage capacity in each bucket, each stored object can be up to 5 TB, and it is easy to manage through the AWS Management Console, API, and CLI. S3 also integrates deeply with other AWS services for governance, data analytics, machine learning, web integration, and more.
Pricing Model:
Storage actual usage
Request type (PUT, GET, LIST…)
Data transfer
Retrieving
Reference: https://aws.amazon.com/s3
8. AWS AppSync
Type: API, Mobile service
Description: AppSync is an AWS service for building real-time communication applications, such as data-driven mobile or web apps, with the support of GraphQL APIs.
Pricing Model:
Query operation
Data modification operation
Real-time data updates
Reference: https://aws.amazon.com/appsync/
9. AWS Fargate
Type: Compute, container
Description: Serverless compute for containers. Fargate can be used with both EKS and ECS (orchestration).
Pricing Model:
Resource vCPU per hour
Resource RAM per hour
10. AWS Step Functions
Type: Controller, Cron job
Description: Gone are the days of writing cron jobs at the operating-system level and then branching into the corresponding business logic. AWS Step Functions is a service that helps you build applications whose logic advances step by step over time through state machines. This is a truly excellent service.
Pricing Model:
State Transition.
Reference: https://aws.amazon.com/step-functions/
11. Amazon RDS Aurora Serverless
Type: Database, SQL
Description: Aurora is an engine in Amazon RDS created by AWS (AWS property). Aurora MySQL is 5x faster than standard MySQL, and Aurora PostgreSQL is 3x faster than standard Postgres. Unlike DynamoDB, Aurora is a SQL service. For a large application you may need to combine several types of database services to get the best performance.
Pricing model:
ACU (Aurora Capacity Unit)
Storage
Reference: https://aws.amazon.com/rds/aurora/serverless/
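A minimal sketch of querying Aurora Serverless through the RDS Data API with boto3, which avoids managing persistent connections; the cluster ARN, secret ARN, and table are placeholders:

import boto3

rds_data = boto3.client("rds-data")
result = rds_data.execute_statement(
    resourceArn="arn:aws:rds:us-east-1:123456789012:cluster:blog-cluster",  # placeholder
    secretArn="arn:aws:secretsmanager:us-east-1:123456789012:secret:blog-db",  # placeholder
    database="blogdb",
    sql="SELECT id, title FROM posts LIMIT 10",
)
# Each row comes back as a list of typed field objects.
for row in result["records"]:
    print(row)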
Conclusion
Kevin believes there will be more and more serverless-oriented services, and that the cost of using the cloud will keep being optimized in favor of users. Thanks to AWS, Azure, and GCP for releasing more and more good cloud services.
Have fun!
See more: https://cloudemind.com/aws-serverless-services/
karonbill · 2 years
AWS SOA-C02 Questions and Answers
Want to pass the SOA-C02 AWS Certified SysOps Administrator - Associate exam? PassQuestion provides the latest AWS SysOps Administrator Associate SOA-C02 questions and answers, covering material that is highly likely to appear in the actual SOA-C02 exam. This way, you can save a lot of time and carry on with your daily routine without worrying about exam preparation. The SOA-C02 questions and answers are designed on the pattern of the real exam, which will help you pass on the first attempt and boost your confidence for the real test.
AWS Certified SysOps Administrator – Associate (SOA-C02) Exam
This credential helps organizations identify and develop talent with critical skills for implementing cloud initiatives. Earning AWS Certified SysOps Administrator - Associate demonstrates experience deploying, managing, and operating workloads on AWS.
To earn this certification, you'll need to take and pass the AWS Certified SysOps Administrator - Associate exam. The exam features a combination of three possible question formats, including multiple-choice, multiple responses, and exam labs. Exam labs allow you to showcase your skills by building solutions using the AWS Management Console and AWS Command Line Interface (CLI).
Exam Overview
Level: Associate
Length: 180 minutes to complete the exam
Cost: 150 USD. Visit Exam pricing for additional cost information.
Format: 65 scoring opportunities that may be multiple choice, multiple response, or exam lab
Delivery method: Pearson VUE testing center or online proctored exam
Languages: English, Japanese, Korean, and Simplified Chinese
Exam Domain
Domain 1: Monitoring, Logging, and Remediation - 20%
Domain 2: Reliability and Business Continuity - 16%
Domain 3: Deployment, Provisioning, and Automation - 18%
Domain 4: Security and Compliance - 16%
Domain 5: Networking and Content Delivery - 18%
Domain 6: Cost and Performance Optimization - 12%
View Online AWS Certified SysOps Administrator – Associate (SOA-C02) Free Questions
A company is expanding its fleet of Amazon EC2 instances before an expected increase of traffic. When a SysOps administrator attempts to add more instances, an InstanceLimitExceeded error is returned. What should the SysOps administrator do to resolve this error?
A. Add an additional CIDR block to the VPC.
B. Launch the EC2 instances in a different Availability Zone.
C. Launch new EC2 instances in another VPC.
D. Use Service Quotas to request an EC2 quota increase.
Answer: D

A SysOps administrator developed a Python script that uses the AWS SDK to conduct several maintenance tasks. The script needs to run automatically every night. What is the MOST operationally efficient solution that meets this requirement?
A. Convert the Python script to an AWS Lambda function. Use an Amazon EventBridge (Amazon CloudWatch Events) rule to invoke the function every night.
B. Convert the Python script to an AWS Lambda function. Use AWS CloudTrail to invoke the function every night.
C. Deploy the Python script to an Amazon EC2 instance. Use Amazon EventBridge (Amazon CloudWatch Events) to schedule the instance to start and stop every night.
D. Deploy the Python script to an Amazon EC2 instance. Use AWS Systems Manager to schedule the instance to start and stop every night.
Answer: A

A company uses AWS CloudFormation templates to deploy cloud infrastructure. An analysis of all the company's templates shows that the company has declared the same components in multiple templates. A SysOps administrator needs to create dedicated templates that have their own parameters and conditions for these common components. Which solution will meet this requirement?
A. Develop a CloudFormation change set.
B. Develop CloudFormation macros.
C. Develop CloudFormation nested stacks.
D. Develop CloudFormation stack sets.
Answer: C

An errant process is known to use an entire processor and run at 100%. A SysOps administrator wants to automate restarting the instance once the problem occurs for more than 2 minutes. How can this be accomplished?
A. Create an Amazon CloudWatch alarm for the Amazon EC2 instance with basic monitoring. Enable an action to restart the instance.
B. Create a CloudWatch alarm for the EC2 instance with detailed monitoring. Enable an action to restart the instance.
C. Create an AWS Lambda function to restart the EC2 instance, triggered on a scheduled basis every 2 minutes.
D. Create a Lambda function to restart the EC2 instance, triggered by EC2 health checks.
Answer: B

A company using AWS Organizations requires that no Amazon S3 buckets in its production accounts should ever be deleted. What is the SIMPLEST approach the SysOps administrator can take to ensure S3 buckets in those accounts can never be deleted?
A. Set up MFA Delete on all the S3 buckets to prevent the buckets from being deleted.
B. Use service control policies to deny the s3:DeleteBucket action on all buckets in production accounts.
C. Create an IAM group that has an IAM policy to deny the s3:DeleteBucket action on all buckets in production accounts.
D. Use AWS Shield to deny the s3:DeleteBucket action on the AWS account instead of all S3 buckets.
Answer: B

While setting up an AWS managed VPN connection, a SysOps administrator creates a customer gateway resource in AWS. The customer gateway device resides in a data center with a NAT gateway in front of it. What address should be used to create the customer gateway resource?
A. The private IP address of the customer gateway device
B. The MAC address of the NAT device in front of the customer gateway device
C. The public IP address of the customer gateway device
D. The public IP address of the NAT device in front of the customer gateway device
Answer: D

A company wants to be alerted through email when IAM CreateUser API calls are made within its AWS account. Which combination of actions should a SysOps administrator take to meet this requirement? (Choose two.)
A. Create an Amazon EventBridge (Amazon CloudWatch Events) rule with AWS CloudTrail as the event source and IAM CreateUser as the specific API call for the event pattern.
B. Create an Amazon EventBridge (Amazon CloudWatch Events) rule with Amazon CloudSearch as the event source and IAM CreateUser as the specific API call for the event pattern.
C. Create an Amazon EventBridge (Amazon CloudWatch Events) rule with AWS IAM Access Analyzer as the event source and IAM CreateUser as the specific API call for the event pattern.
D. Use an Amazon Simple Notification Service (Amazon SNS) topic as an event target with an email subscription.
E. Use an Amazon Simple Email Service (Amazon SES) notification as an event target with an email subscription.
Answer: A, D

A company is testing Amazon Elasticsearch Service (Amazon ES) as a solution for analyzing system logs from a fleet of Amazon EC2 instances. During the test phase, the domain operates on a single-node cluster. A SysOps administrator needs to transition the test domain into a highly available production-grade deployment. Which Amazon ES configuration should the SysOps administrator use to meet this requirement?
A. Use a cluster of four data nodes across two AWS Regions. Deploy four dedicated master nodes in each Region.
B. Use a cluster of six data nodes across three Availability Zones. Use three dedicated master nodes.
C. Use a cluster of six data nodes across three Availability Zones. Use six dedicated master nodes.
D. Use a cluster of eight data nodes across two Availability Zones. Deploy four master nodes in a failover AWS Region.
Answer: B
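To make the CreateUser alerting pattern from the "Choose two" question above concrete, here is a minimal boto3 sketch; it assumes a pre-existing SNS topic with an email subscription, and the rule name and topic ARN are hypothetical placeholders.

import json
import boto3

events = boto3.client("events")

# Match IAM CreateUser calls recorded by CloudTrail (answer A).
# Note: IAM is a global service; CloudTrail delivers its events in us-east-1.
events.put_rule(
    Name="iam-create-user",  # hypothetical rule name
    EventPattern=json.dumps({
        "source": ["aws.iam"],
        "detail-type": ["AWS API Call via CloudTrail"],
        "detail": {"eventSource": ["iam.amazonaws.com"],
                   "eventName": ["CreateUser"]},
    }),
)

# Route matches to an SNS topic with an email subscription (answer D).
events.put_targets(
    Rule="iam-create-user",
    Targets=[{"Id": "sns",
              "Arn": "arn:aws:sns:us-east-1:123456789012:alerts"}],  # placeholder ARN
)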
lakshya01 · 3 years
Integrating AWS Security Hub with Splunk via Amazon EventBridge.
Amazon Security Hub
Amazon Security Hub gives you a comprehensive view of your security alerts and security posture across your Amazon Web Services accounts. A range of powerful security tools is at your disposal, from firewalls and endpoint protection to vulnerability and compliance scanners.
https://docs.aws.amazon.com/securityhub/?id=docs_gateway
Amazon EventBridge
Amazon EventBridge is a serverless event bus that makes it easier to build event-driven applications at scale using events generated from your applications, integrated Software-as-a-Service (SaaS) applications, and AWS services. EventBridge delivers a stream of real-time data from event sources such as Zendesk or Shopify to targets like AWS Lambda and other SaaS applications. You can set up routing rules to determine where to send your data to build application architectures that react in real-time to your data sources with event publisher and consumer completely decoupled.
https://docs.aws.amazon.com/eventbridge/index.html
CloudWatch Log Groups
CloudWatch Logs enables you to centralize the logs from all of your systems, applications, and AWS services that you use, in a single, highly scalable service. You can then easily view them, search them for specific error codes or patterns, filter them based on specific fields, or archive them securely for future analysis. CloudWatch Logs enables you to see all of your logs, regardless of their source, as a single and consistent flow of events ordered by time, and you can query them and sort them based on other dimensions, group them by specific fields, create custom computations with a powerful query language, and visualize log data in dashboards.
Splunk Enterprise
Splunk Enterprise Security (ES) provides security information and event management (SIEM) for machine data generated from security technologies such as network, endpoint, access, malware, vulnerability and identity information. It is a premium application that is licensed independently.
Now, the steps for the integration:
Step 1: First of all, we have to enable Security Hub.
Step 2: In the Findings section of Security Hub we can see the findings (logs). If you don't have any yet, you can use any of the services that feed Security Hub; for simplicity, I used GuardDuty (https://aws.amazon.com/guardduty/) and generated sample logs.
Step 3: Now it's time to set up Amazon EventBridge to route events from the source to the target destination (here, the event source is Security Hub and the target is a CloudWatch log group).
Step 4: Once you land in Amazon EventBridge, create a rule under Rules, and select Security Hub as the event source.
Then select the CloudWatch log group as the target, and click on Create.
Step 5: Now it's time to create an IAM user with the CloudWatchLogsReadOnlyAccess permission.
NOTE: Don't forget to copy the access key ID and secret access key of the user you created.
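For reference, the same rule can also be created programmatically; here is a minimal boto3 sketch, where the rule name and the log-group ARN are placeholders:

import boto3

events = boto3.client("events")

# Match all findings imported into Security Hub.
events.put_rule(
    Name="securityhub-to-cwlogs",  # hypothetical rule name
    EventPattern='{"source": ["aws.securityhub"], '
                 '"detail-type": ["Security Hub Findings - Imported"]}',
)

# Target: the CloudWatch Logs log group that Splunk will read from.
events.put_targets(
    Rule="securityhub-to-cwlogs",
    Targets=[{
        "Id": "loggroup",
        "Arn": "arn:aws:logs:us-east-1:123456789012:"
               "log-group:/aws/events/securityhub:*",  # placeholder ARN
    }],
)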
Now it's time to set up Splunk. First of all, go to the Splunk dashboard and install the AWS add-on app.
Step 6: In Apps there is an option called Find More Apps. Click on that, type "aws" in the search bar, and you will find the add-on.
Install it, and then in the Configuration section add the access key ID and secret access key that you generated for the user with the CloudWatchLogsReadOnlyAccess permission. Leave the region as Global.
Step 7: After that, add an index in Splunk. In Settings you will find the Indexes option.
Click on New Index and type a name for the index.
Then come back to the AWS add-on app, click on Inputs, and create an input. In the Create Input menu there are different parameter sets for inputs; click on Custom Data Types.
Step 8: Here you get the CloudWatch Logs option; click on it, enter the details, and save.
Step 9: Now your configuration is done. Go to Search & Reporting and search by typing index=<the index name you created>. You will then see the log stream being fetched via the Amazon EventBridge target.
For further information, mail: [email protected]
Build a Serverless News Data Pipeline using ML on AWS Cloud
By Maria Zentsova, Senior Data Analyst at Wood Mackenzie. As an analyst, I spend a lot of time tracking news and industry updates. Thinking about this problem on my maternity leave, I decided to build a simple app to track news on green tech and renewable energy. Using AWS Lambda and other AWS services like EventBridge, SNS, DynamoDB, and SageMaker, it's very easy to get started and build a…
globalmediacampaign · 3 years
Capture changes from Amazon DocumentDB via AWS Lambda and publish them to Amazon MSK
When using a document data store as your service's source of truth, you may need to share the changes of this source with other downstream systems. The data events that are happening within this data store can be converted to business events, which can then be sourced into multiple microservices that implement different business functionalities. Capturing the changes from data sources is called change data capture (CDC); you can implement it in different ways with different data technologies. In the case of Amazon DocumentDB (with MongoDB compatibility), you can implement CDC via the change streams functionality. This feature simplifies the process of listening to committed changes to documents in a set of collections in real time. The events are also time-ordered within a stream, which makes the stream a reliable mechanism for state replication scenarios.

In this post, I show how you can capture changes from Amazon DocumentDB by using AWS Lambda implemented in NodeJS. After the Lambda function captures the change events, it publishes them to Amazon Managed Streaming for Apache Kafka (Amazon MSK).

Architecture

By completing the steps in this post, you can create a system that uses the architecture illustrated in the following image. The flow of events starts when we make changes within a collection residing in the Amazon DocumentDB database. As the changes arrive, Amazon DocumentDB copies them into a change stream dedicated to that collection. A Lambda function connects to this change stream and polls these events. After the function filters out events other than insert, update, and delete, it publishes them to a Kafka topic in an MSK cluster.

A Lambda function is a stateless component, and it has a limited lifespan. Because the polling activity should be continuous, we need to run the Lambda function on a schedule. This architecture uses Amazon EventBridge to schedule the function to run every minute. In this sample architecture, each Lambda function triggered by the EventBridge engine connects to Amazon DocumentDB and watches for changes for a predefined time period (15 seconds in this case). At the end of each poll cycle, the function writes the last polled resume token to another collection in the same Amazon DocumentDB database. This checkpoint mechanism allows Lambda functions to resume the polling activity without needing to replay all the events from the beginning of the stream. This checkpointing mechanism should be in place even if we choose to use a long-running application on virtual machine or container-based compute infrastructure, because if the underlying compute instance is restarted or scaled out, the new instance needs a starting point rather than having to process the whole history. A change stream can hold up to 7 days of information (determined by the change_stream_log_retention_duration parameter), which can translate to a significant number of change events for active applications. For this post, we use Amazon DocumentDB version 4.0.
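The post's actual function is implemented in NodeJS; as a rough illustration of the same polling-and-checkpoint logic, here is a hedged Python sketch using the third-party pymongo and kafka-python libraries. The endpoint, credentials, and the 15-second time box are stand-ins for the real configuration.

import json
from pymongo import MongoClient
from kafka import KafkaProducer

# Placeholders: the real values come from the stack's outputs.
client = MongoClient("mongodb://user:pass@DOCDB_ENDPOINT:27017/?ssl=true")
db = client["blogdb"]
producer = KafkaProducer(bootstrap_servers="MSK_BOOTSTRAP_SERVERS")

# Resume from the last checkpointed resume token, if one exists.
checkpoint = db["checkpoints"].find_one({"_id": 1}) or {}
options = {}
if checkpoint.get("checkpoint"):
    options["resume_after"] = checkpoint["checkpoint"]

with db["blogcollection"].watch(**options) as stream:
    for change in stream:  # in the Lambda, this loop is time-boxed (~15 seconds)
        if change["operationType"] in ("insert", "update", "delete"):
            producer.send("blog-events",
                          json.dumps(change, default=str).encode())
        # Persist the resume token so the next invocation continues from here.
        db["checkpoints"].update_one(
            {"_id": 1}, {"$set": {"checkpoint": change["_id"]}}, upsert=True
        )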
Deploy the stack

To deploy the sample architecture into your AWS environment, we use an AWS Serverless Application Model (AWS SAM) template. The template creates the following resources in your account:

- An Amazon DocumentDB cluster (version 4.0)
- An MSK cluster
- A Lambda function (function-documentdb-stream-processor) that polls the change stream events from the Amazon DocumentDB cluster and publishes them to the MSK cluster
- An AWS Cloud9 environment, which allows you to configure source and destination systems and run your tests
- A VPC and subnets
- A NAT gateway and internet gateway
- Other supporting resources such as security groups and AWS Identity and Access Management (IAM) roles

You will incur some costs after creating this environment. To start your deployment, clone the GitHub repository to your local machine and install and configure AWS SAM with a test IAM user. AWS SAM requires you to specify an Amazon Simple Storage Service (Amazon S3) bucket to hold the deployment artifacts. If you haven't already created a bucket for this purpose, create one now. The bucket should be reachable by the IAM user you use for deploying AWS SAM packages. At the command line, navigate to the cloned GitHub repository's folder and enter the following command to package the application:

sam package --template template.yaml --output-template-file output_template.yaml --s3-bucket BUCKET_NAME_HERE

Replace BUCKET_NAME_HERE with the name of the S3 bucket that holds the deployment artifacts. AWS SAM packages the application and copies it into the S3 bucket. When the AWS SAM package command finishes running, enter the following command to deploy the package:

sam deploy --template-file output_template.yaml --stack-name Blogstack --capabilities CAPABILITY_IAM --parameter-overrides docDBUser=masterUsername docDBPass=masterPass docDBClusterName=docDBCluster mskClusterName=blog-msk-clstr

In the preceding command, you can supply your own stack name by changing the stack-name parameter's value. This template also allows you to provide the following input parameters and override their default values:

- docDBUser
- docDBPass
- docDBClusterName
- mskClusterName

When you run this command, AWS SAM shows the progress of the deployment. The deployment takes around 15 minutes and creates a main stack and a dependent stack for the AWS Cloud9 environment in AWS CloudFormation. You can also track the overall deployment status on the AWS CloudFormation console. When the deployment is complete, AWS SAM outputs the following parameters, which you need while doing additional system configurations. These parameters are also available on the AWS CloudFormation console, on the Outputs tab of the deployed stack named Blogstack.

Connecting to your AWS Cloud9 environment

An AWS Cloud9 environment is created for you automatically when you deploy the AWS SAM package. You need to further provision this environment with MongoDB and Kafka command line tools. To start provisioning your AWS Cloud9 environment, follow the URL that was provided by the Cloud9URL output parameter of the deployed CloudFormation stack. When the environment starts, go to the terminal section.

Configure Amazon DocumentDB

You can now install the mongo shell onto your AWS Cloud9 environment. Use the following commands in the terminal:

echo -e "[mongodb-org-4.0]\nname=MongoDB Repository\nbaseurl=https://repo.mongodb.org/yum/amazon/2013.03/mongodb-org/4.0/x86_64/\ngpgcheck=1\nenabled=1\ngpgkey=https://www.mongodb.org/static/pgp/server-4.0.asc" | sudo tee /etc/yum.repos.d/mongodb-org-4.0.repo

sudo yum install -y mongodb-org-shell

You also need the Amazon DocumentDB CA certificate to connect to your cluster.
Use the following command to download the certificate to the current folder (~/environment):

wget https://s3.amazonaws.com/rds-downloads/rds-combined-ca-bundle.pem

Enter the following command to connect to your cluster:

mongo --ssl --host DOCUMENTDB_CLUSTER_ENDPOINT_HERE:27017 --sslCAFile rds-combined-ca-bundle.pem --username DOCUMENTDB_USERNAME_HERE --password DOCUMENTDB_PASSWORD_HERE

In the preceding command, provide the cluster endpoint of the Amazon DocumentDB cluster that was output from the AWS SAM installation. Also provide the username and password that you used during the sam deploy command.

Create a database (blogdb):

use blogdb

We create two collections in the database. The first collection is named blogcollection; we use it as the data source for the change stream integration. Use the following command to create the empty blogcollection:

db.createCollection("blogcollection")

Enable change streams on this collection by running the following adminCommand command:

db.adminCommand({modifyChangeStreams: 1, database: "blogdb", collection: "blogcollection", enable: true});

You need to also enable change streams in the cluster's parameter group before they can be used. You can enable Amazon DocumentDB change streams for all collections within a given database, or only for selected collections. We use the second collection, checkpoints, to store the checkpoint document that holds the last processed resume token:

db.checkpoints.insert({_id: 1, checkpoint: 0})

You can now issue the exit command to exit the mongo shell and continue with the next step:

exit

Configure the MSK cluster

To configure the MSK cluster, you need to install Kafka into your AWS Cloud9 environment. Use the following commands in your AWS Cloud9 terminal to download Kafka from the source, extract it, and navigate to the bin folder:

wget https://apache.mirror.colo-serv.net/kafka/2.7.0/kafka_2.13-2.7.0.tgz
tar -xzf kafka_2.13-2.7.0.tgz
cd kafka_2.13-2.7.0/bin

The Kafka binaries we use in this post require Java 8 or later. Check your environment's Java version with the following command:

java -version

If you see a version below 1.8, issue the commands below to upgrade to Java 8:

sudo yum -y install java-1.8.0-openjdk-devel
sudo alternatives --config java

Select the 1.8 version from the list.

Find the bootstrap servers of your MSK cluster: navigate to the Amazon MSK console and choose your cluster. In the Cluster summary pane on the Details tab, choose View client information and copy the bootstrap servers host/port pairs.

Within the Kafka installation's bin directory, issue the following command to create a topic to hold the events published by function-documentdb-stream-processor:

sudo ./kafka-topics.sh --create --topic blog-events --replication-factor 1 --partitions 1 --bootstrap-server MSK_BOOTSTRAP_SERVERS_HERE

Replace MSK_BOOTSTRAP_SERVERS_HERE with the value of the host/port pairs from the previous step.

Test the solution

To test the setup from end to end, you need to open a second terminal in your AWS Cloud9 environment. On the Window menu, choose New Terminal. In the first terminal, make sure you're in the bin folder of the Kafka installation and issue the following command to start listening to the records in the Kafka topic:

sudo ./kafka-console-consumer.sh --topic blog-events --from-beginning --bootstrap-server MSK_BOOTSTRAP_SERVERS_HERE

As before, provide the value of the bootstrap server host/port pairs.
In the second terminal, use the mongo shell to connect to the Amazon DocumentDB cluster the same way you did earlier. Issue the following commands to insert a document into blogdb.blogcollection:

use blogdb;
db.blogcollection.insert({"title" : "Blog Title 1"})

Add another document with the following command:

db.blogcollection.insert({"title" : "Blog Title 2"})

In the first terminal, observe the changes on the Kafka topic as you add different documents to the collection.

Cleanup

To clean up the resources you used in your account, delete the stack from the AWS CloudFormation console. You can also delete the bucket you used for packaging and deploying the AWS SAM application.

Conclusion

This architecture shows how to capture state changes from Amazon DocumentDB via its change streams functionality and send them to Amazon MSK. You can adapt similar architectures to other use cases, such as query segregation, event sourcing, data duplication, and more. For more information about the streams functionality and other integrations, see Run full text search queries on Amazon DocumentDB (with MongoDB compatibility) data with Amazon Elasticsearch Service and Using Change Streams with Amazon DocumentDB. If you have any questions or comments about this post, please share them in the comments. If you have any feature requests for Amazon DocumentDB, email us at [email protected]

About the author

Murat Balkan is an AWS Solutions Architect based in Toronto. He helps customers across Canada to transform their businesses and build industry leading solutions on AWS.

https://aws.amazon.com/blogs/database/capture-changes-from-amazon-documentdb-via-aws-lambda-and-publish-them-to-amazon-msk/