[{"content":"Introduction Amazon Macie is a managed AWS service, machine learning powered security service that helps organizations discover, classify, and protect sensitive data stored in Amazon S3. It continuously evaluates data in S3 buckets to identify sensitive information such as Personally Identifiable Information (PII), financial records, and access credentials.\nOne of Macie’s key strengths is its ability to combine built-in intelligence with customization. It provides a rich set of managed data identifiers for common sensitive data types, while also allowing users to define custom detection rules using pattern matching techniques such as regular expressions.\nAmazon S3, being a highly scalable and widely used object storage service, often stores large volumes of critical and sensitive data. This makes it essential to have automated mechanisms in place to monitor and secure that data.\nIn this lab, we explore how Amazon Macie can be used to analyze S3 data, identify sensitive information, and integrate with other AWS services to enable real-time security monitoring and alerting.\nOverview of the Architecture The workflow includes:\nAmazon S3 for storing data Amazon Macie for sensitive data discovery Amazon EventBridge for event routing Amazon SNS for email notifications Step 1: Creating and Populating an S3 Bucket I created an S3 bucket and uploaded a mix of sensitive and non-sensitive files.\nThe dataset included:\nCredit card information Employee records (PII) AWS access keys License plate data A non-sensitive image file This setup helps evaluate how Macie differentiates between sensitive and non-sensitive content.\nStep 2: Enabling Amazon Macie Amazon Macie was enabled from the AWS console.\nOnce enabled, Amazon Macie provides a centralized dashboard that gives visibility into the security posture of your S3 data. It highlights automated discovery results, including the number of buckets being monitored.\nThe dashboard also includes a Data Security section, which provides insights into:\nPublic access configurations Encryption status Bucket sharing settings These metrics help quickly assess potential risks and misconfigurations across your S3 environment.\nStep 3: Running a Classification Job To analyze the data stored in the S3 bucket, I created a one-time classification job in Amazon Macie.\nDuring configuration, I selected:\nOne-time job – suitable for initial discovery and point-in-time analysis All managed data identifiers – leveraging Macie’s built-in detection for common sensitive data types such as PII, financial data, and credentials No custom identifiers (initially) – to first evaluate the effectiveness of default detection capabilities Amazon Macie performs intelligent sampling and pattern matching on objects within the selected S3 bucket. 
In doing so, Macie uses:\nMachine learning models to identify anomalies\nPattern matching for structured data (e.g., credit card numbers, access keys)\nContextual analysis to reduce false positives\nJob type selection: One-time jobs are ideal for audits, while scheduled jobs are better for continuous monitoring\nIdentifier coverage: Managed identifiers provide broad coverage but may miss region-specific or custom data formats\nScan depth: Deeper scans increase detection accuracy but may impact cost and execution time\nAfter submitting the job, Macie processed the objects and generated findings based on detected sensitive data.\nStep 4: Reviewing Findings Once the classification job completed, Amazon Macie generated detailed findings based on the sensitive data detected within the S3 bucket.\nFinancial Data Macie identified financial data such as credit card numbers and classified these as high severity findings. These types of exposures pose immediate risk due to their potential for fraud and regulatory implications.\nPersonal Data Personally Identifiable Information (PII), including employee records, was classified as medium severity findings. Macie uses managed data identifiers to recognize patterns such as names, addresses, and structured personal data.\nKey Observations High severity findings were associated with financial data and exposed credentials, indicating high exploitability PII detection demonstrates Macie’s effectiveness in identifying structured personal data Non-sensitive objects (e.g., images) were correctly ignored, reducing noise and false positives Context-aware classification helps prioritize remediation based on risk level The plates.txt file containing Australian license plate data was not flagged.\nThis highlights an important limitation:\nManaged data identifiers cover common global patterns, but may not detect region-specific or custom data formats.\nThis gap reinforces the need for custom data identifiers, which are addressed in the next step.\nStep 5: Setting Up Notifications (SNS + EventBridge) To enable real-time visibility into security findings, I implemented an event-driven notification pipeline using Amazon SNS and EventBridge. Instead of manually checking the Macie dashboard, this setup ensures that any new findings automatically trigger alerts.\nSNS Topic I created an SNS topic to act as the notification channel. 
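The full pipeline for this step can also be sketched with the AWS CLI. This is a minimal, hedged sketch: the topic name, rule name, email address, region, and account ID are illustrative assumptions, and the topic's access policy must additionally allow EventBridge to publish to it:\n# Hedged sketch: SNS topic + email subscription + EventBridge rule for Macie findings\naws sns create-topic --name macie-findings-topic\naws sns subscribe --topic-arn arn:aws:sns:us-east-1:123456789012:macie-findings-topic --protocol email --notification-endpoint security-team@example.com\n# Macie publishes findings to EventBridge with source aws.macie and detail-type "Macie Finding"\naws events put-rule --name macie-finding-to-sns --event-pattern '{"source":["aws.macie"],"detail-type":["Macie Finding"]}'\naws events put-targets --rule macie-finding-to-sns --targets '[{"Id":"sns-email","Arn":"arn:aws:sns:us-east-1:123456789012:macie-findings-topic"}]'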
This topic serves as the endpoint where security events are published.\nEmail Subscription An email subscription was added to the SNS topic, allowing findings to be delivered directly to an inbox, providing immediate awareness of potential data exposure risks.\nEventBridge Rule I configured an EventBridge rule to listen for Amazon Macie finding events and route them to the SNS topic.\nHow the workflow operates Amazon Macie detects sensitive data and generates a finding EventBridge captures the finding event in real time The rule routes the event to the SNS topic SNS delivers a notification to subscribed endpoints (email) This architecture demonstrates a decoupled, event-driven security workflow, where detection and notification are loosely coupled.\nEnables real-time alerting without manual intervention Scales easily to additional targets (e.g., Lambda, SIEM tools) Forms the foundation for automated remediation pipelines This approach is critical in production environments where rapid response to data exposure is required.\nStep 6: Adding a Custom Data Identifier To detect Australian license plates—data that was not identified during the initial scan—I created a custom data identifier using a regular expression (regex). This allowed Amazon Macie to recognize a specific data pattern that is not included in its default managed identifiers.\nRunning a Custom Classification Job After defining the custom identifier, I created a second Amazon Macie classification job and included it in the configuration. This ensured that Macie would scan for both standard sensitive data types and the newly defined pattern.\nResults from Custom Detection Once the job completed, Amazon Macie successfully detected license plate data that was previously missed during the initial scan.\nThis demonstrates the value of custom data identifiers in extending Amazon Macie’s detection capabilities, particularly for identifying region-specific or business-specific sensitive data that is not covered by default rules.\nKey Insights Amazon Macie effectively identifies sensitive data such as PII, credentials, and financial information Managed data identifiers provide strong out-of-the-box detection capabilities Custom data identifiers enable tailored detection for organization-specific use cases Amazon SNS and Amazon EventBridge support event-driven security workflows and automated responses Sensitive data stored in Amazon S3 can be continuously monitored, classified, and alerted on Conclusion This lab demonstrated how Amazon Macie can be used to identify and monitor sensitive data in Amazon S3 through automated classification jobs.\nBy integrating Macie with Amazon EventBridge and Amazon SNS, it is possible to build real-time alerting mechanisms that improve visibility and response times to potential data exposure risks.\nExtending detection with custom data identifiers further enhances Macie’s ability to detect organization-specific and region-specific sensitive data patterns that are not covered by default rules.\nOverall, Amazon Macie plays a critical role in strengthening data security in AWS environments by enabling proactive detection, classification, and protection of sensitive information.\n","permalink":"https://manassehmwangi.devatlas.org/blog/post6/macie/","summary":"Introduction Amazon Macie is a fully managed, machine learning-powered security service that helps organizations discover, classify, and protect sensitive data stored in Amazon S3. 
It continuously evaluates data in S3 buckets to identify sensitive information such as Personally Identifiable Information (PII), financial records, and access credentials.\nOne of Macie’s key strengths is its ability to combine built-in intelligence with customization. It provides a rich set of managed data identifiers for common sensitive data types, while also allowing users to define custom detection rules using pattern matching techniques such as regular expressions.","title":"Automating Data Security in S3 with Amazon Macie"},{"content":"Introduction Amazon Inspector is a vulnerability management service that continuously scans AWS resources for security flaws, package vulnerabilities, and insecure coding practices.\nIt supports multiple resource types, including:\nLambda Functions – scans package layers and code for vulnerabilities EC2 Virtual Servers – deep inspection of Linux and Windows OS packages Elastic Container Registry (ECR) – scans container images for CVEs and malware Code Repositories – integrates with GitHub and GitLab to detect insecure code Amazon Inspector Dashboard and Findings The Amazon Inspector dashboard provides a centralized view of your environment’s security posture, including coverage across EC2, ECR, Lambda, and code repositories.\nIt highlights findings categorized by severity, allowing teams to prioritize critical vulnerabilities that are easier for attackers to exploit.\nEach finding includes:\nExploit availability – whether public exploits exist Fix availability – whether remediation is possible In this lab, I explored how Amazon Inspector scans container images pushed to Amazon ECR, as well as code pushed to GitHub.\nExploring ECR Image Scanning Within ECR, we can view the available container images:\nFor this demonstration, I used the Juice Shop image, a deliberately vulnerable application commonly used for security testing.\nAfter pushing the image to ECR, Amazon Inspector automatically detected it and initiated a vulnerability scan.\nInspector Findings for the Image Amazon Inspector successfully captured the image and generated findings based on identified vulnerabilities in the container.\nThe findings include several critical vulnerabilities, which can be expanded to view more detailed information about each issue.\nThe Inspector score provides insights into the severity and exploitability of the vulnerabilities, supported by detailed vulnerability intelligence to guide remediation efforts.\nDetailed Findings and Vulnerability Insights Inspector provides deep visibility into each vulnerability.\nExpanding a finding reveals:\nAccount ID, severity, affected packages, and timestamps Inspector score and vulnerability intelligence Integration with MITRE ATT\u0026amp;CK techniques External references such as vendor severity and tools like Tenable Vulnerability Database Search Amazon Inspector provides a built-in vulnerability database search feature that allows you to query specific vulnerabilities, such as CVE-2023-37466.\nThis enables you to:\nQuickly identify affected resources Understand the severity of the vulnerability Gain insight into exploitability and potential impact The results provide a clear view of the risk level, helping prioritize remediation efforts effectively.\nVulnerability Types and Remediation Amazon Inspector helps identify insecure coding practices that can introduce serious security risks into applications and workloads.\nSome common issues include:\nExcessive privileges – for example, running processes as root, which 
increases the impact if a system is compromised Credential exposure – such as accidentally logging or exposing AWS access keys in application logs Unsafe code execution – for example, using functions like pickle.loads, which can allow arbitrary code execution if untrusted input is processed These vulnerabilities can be exploited to gain unauthorized access, escalate privileges, or execute malicious code within an environment.\nRecommended Remediation To reduce these risks, best practices include:\nValidating and sanitizing all user input before processing it Avoiding unsafe system or subprocess calls where possible Replacing insecure libraries or functions with safer, modern alternatives Integration with Code Repositories Amazon Inspector now supports scanning of code stored in source code repositories, allowing developers to identify vulnerabilities early in the development lifecycle.\nThis feature enables continuous security analysis of code hosted in repositories, helping detect insecure coding patterns before deployment.\nConnecting GitHub for Code Scanning To enable scanning, we integrate Amazon Inspector with GitHub through the integration tab. By selecting “Install a new app”, we authenticate and choose the repositories we want to scan.\nOnce the integration is complete, the selected repositories are imported into the Inspector workspace, and their status is set to active, indicating they are ready for scanning.\nTriggering Code Scans At this stage, the scan status may remain inactive until changes are pushed to the repository. A code commit or update is required to trigger Amazon Inspector to begin scanning.\nAfter pushing changes, the scan is successfully triggered, and the status updates from inactive to active, indicating that analysis is in progress.\nFindings from Code Repositories The scanned repository (e.g., cybershujaa_Django) contains intentionally vulnerable code used for testing purposes. It is based on a deliberately vulnerable Django application originally created by nVisium.\nAmazon Inspector identifies multiple issues, including:\nSensitive data exposure (e.g., JWT tokens logged via print statements) SQL injection vulnerabilities General insecure coding practices Each finding includes:\nThe exact file and line number Severity level Suggested remediation steps This level of detail helps developers quickly locate and fix security issues within the codebase.\nInspector Dashboard Overview Returning to the dashboard, we can see a consolidated view of vulnerabilities from both:\nContainer images in ECR Code repositories integrated with GitHub The summary highlights:\nCritical findings Environment coverage Vulnerabilities with exploit availability Issues with available fixes Key Insight Amazon Inspector’s code security feature allows developers to shift security left by scanning repositories directly within the CI/CD workflow. 
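As a hedged sketch, the ECR half of this workflow could be driven from the CLI as below; the account ID, region, and repository name are assumed placeholders, and bkimminich/juice-shop is the public Juice Shop image on Docker Hub:\n# Hedged sketch: enable Inspector scanning, push the Juice Shop image to ECR,\n# then pull only the critical findings for triage (IDs and region are placeholders)\naws inspector2 enable --resource-types ECR EC2 LAMBDA\naws ecr get-login-password --region us-east-1 | docker login --username AWS --password-stdin 123456789012.dkr.ecr.us-east-1.amazonaws.com\ndocker tag bkimminich/juice-shop:latest 123456789012.dkr.ecr.us-east-1.amazonaws.com/juice-shop:latest\ndocker push 123456789012.dkr.ecr.us-east-1.amazonaws.com/juice-shop:latest\naws inspector2 list-findings --filter-criteria '{"severity":[{"comparison":"EQUALS","value":"CRITICAL"}]}'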
The feature integrates seamlessly with GitHub, GitLab, and self-managed Git servers.\nThis ensures that vulnerabilities are detected early, reducing risk before applications are deployed into production.\nSummary Table: Amazon Inspector Supported Resources and Scanning Overview The table below summarizes the key AWS resources supported by Amazon Inspector and the types of security scanning applied to each.\nResource Type | Scan Focus | Key Features\nLambda Functions | Package layers, code security | Detects vulnerable code, excessive privileges, credential leaks\nEC2 Virtual Servers | OS and package vulnerabilities | Supports Linux \u0026amp; Windows, deep package inspection\nElastic Container Registry (ECR) | Container image vulnerabilities | Scans pushed images, flags EOL OS, CVEs, malware\nCode Repositories (GitHub, GitLab) | Code vulnerabilities | Scans repositories for insecure code practices\nConclusion Amazon Inspector provides a unified approach to vulnerability management across modern AWS workloads, including container images in Amazon ECR and source code stored in repositories such as GitHub.\nThrough this lab, we observed how Inspector automatically detects vulnerabilities after images are pushed and code changes are introduced, offering continuous scanning without manual intervention.\nIts integration with container workflows and code repositories enables a shift-left security approach, allowing teams to identify and remediate issues early in the development lifecycle.\nOverall, Amazon Inspector plays a key role in improving security visibility and helping teams prioritize and address critical vulnerabilities in cloud-native environments.\n","permalink":"https://manassehmwangi.devatlas.org/blog/post5/inspector/","summary":"Introduction Amazon Inspector is a vulnerability management service that continuously scans AWS resources for security flaws, package vulnerabilities, and insecure coding practices.\nIt supports multiple resource types, including:\nLambda Functions – scans package layers and code for vulnerabilities EC2 Virtual Servers – deep inspection of Linux and Windows OS packages Elastic Container Registry (ECR) – scans container images for CVEs and malware Code Repositories – integrates with GitHub and GitLab to detect insecure code Amazon Inspector Dashboard and Findings The Amazon Inspector dashboard provides a centralized view of your environment’s security posture, including coverage across EC2, ECR, Lambda, and code repositories.","title":"End-to-End Vulnerability Management with Amazon Inspector"},{"content":"Introduction API stands for Application Programming Interface. An API is a set of definitions and protocols for building and integrating application software. This interface allows different software programs to interact with each other by calling functions, passing data and accessing different capabilities.\nTypes of APIs The main types are:\nREST APIs - REST (Representational State Transfer) is a common architectural style. They typically provide CRUD (Create, Read, Update, Delete) operations and use HTTP requests such as GET, POST, PUT, DELETE. Data is usually returned in JSON or XML format. SOAP APIs - SOAP (Simple Object Access Protocol) is an older style of web service API that uses XML for messaging. SOAP APIs are more rigid than REST and require more bandwidth. GraphQL APIs - GraphQL is a newer API standard that allows clients to specify exactly what data they need in a query. It can be more efficient than REST for fetching specific fields. 
Webhook APIs - Webhooks allow apps to provide other applications with real-time information via callbacks. The receiving app registers a webhook which triggers an event on a certain action. Async APIs - Async APIs use message queues to enable asynchronous communication between apps. The app publishes a message rather than directly calling another app. Streaming APIs - Streaming APIs give clients a continuous stream of data in real-time. This is useful for apps like live video streaming. Microservices APIs - Microservices break down an app into small modular services with their own APIs. This allows for more flexible and scalable development as services can be independently maintained and updated. Lab This lab is based on Build your first CRUD API in 45 minutes or less! from AWS Workshops.\nPrerequisites\nDynamoDB for Data Storage: It integrates seamlessly with other AWS services, making it a reliable choice for storing and retrieving data in serverless applications. Lambda for Serverless Compute: It can be triggered by various events, such as HTTP requests from API Gateway, making it ideal for building serverless APIs. API Gateway for Endpoint Management: It integrates with Lambda functions, enabling you to define API endpoints that trigger serverless functions to process incoming requests. Cloud9 for Development Environment: It allows developers to write, test, and debug code directly in the cloud. Step 1: Setting up DynamoDB Setting up a DynamoDB table named \u0026ldquo;http-crud-tutorial-items\u0026rdquo; with a primary key \u0026ldquo;id.\u0026rdquo; DynamoDB is a fully managed NoSQL database service provided by AWS, offering scalability, performance, and reliability for handling structured data. Create a DynamoDB table\nStep 2: Creating a Lambda Function Next, we create a Lambda function named \u0026ldquo;http-crud-tutorial-function\u0026rdquo; that interacts with DynamoDB to perform CRUD operations. The Lambda function is written in Node.js 14.x and uses the AWS SDK to communicate with DynamoDB. It handles HTTP requests from API Gateway and executes corresponding operations on the DynamoDB table. 
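For reference, the table from Step 1 can also be created from the CLI rather than the console. This is a minimal sketch using the same table and key names as the lab; on-demand billing is my own assumption:\n# Hedged sketch: create the lab's DynamoDB table with "id" as the partition key\naws dynamodb create-table \\\n  --table-name http-crud-tutorial-items \\\n  --attribute-definitions AttributeName=id,AttributeType=S \\\n  --key-schema AttributeName=id,KeyType=HASH \\\n  --billing-mode PAY_PER_REQUEST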
Create a Lambda function and insert the code below in index.js:\nconst AWS = require(\u0026#34;aws-sdk\u0026#34;);\nconst dynamo = new AWS.DynamoDB.DocumentClient();\n\n// Route each HTTP API request to the matching DynamoDB operation\nexports.handler = async (event, context) =\u0026gt; {\n  let body;\n  let statusCode = 200;\n  const headers = { \u0026#34;Content-Type\u0026#34;: \u0026#34;application/json\u0026#34; };\n  try {\n    // event.routeKey must exactly match the route defined in API Gateway\n    switch (event.routeKey) {\n      case \u0026#34;DELETE /items/{id}\u0026#34;:\n        await dynamo\n          .delete({ TableName: \u0026#34;http-crud-tutorial-items\u0026#34;, Key: { id: event.pathParameters.id } })\n          .promise();\n        body = `Deleted item ${event.pathParameters.id}`;\n        break;\n      case \u0026#34;GET /items/{id}\u0026#34;:\n        body = await dynamo\n          .get({ TableName: \u0026#34;http-crud-tutorial-items\u0026#34;, Key: { id: event.pathParameters.id } })\n          .promise();\n        break;\n      case \u0026#34;GET /items\u0026#34;:\n        body = await dynamo.scan({ TableName: \u0026#34;http-crud-tutorial-items\u0026#34; }).promise();\n        break;\n      case \u0026#34;PUT /items\u0026#34;: {\n        // Create or replace an item from the JSON request body\n        let requestJSON = JSON.parse(event.body);\n        await dynamo\n          .put({ TableName: \u0026#34;http-crud-tutorial-items\u0026#34;, Item: { id: requestJSON.id, price: requestJSON.price, name: requestJSON.name } })\n          .promise();\n        body = `Put item ${requestJSON.id}`;\n        break;\n      }\n      default:\n        throw new Error(`Unsupported route: \u0026#34;${event.routeKey}\u0026#34;`);\n    }\n  } catch (err) {\n    statusCode = 400;\n    body = err.message;\n  } finally {\n    body = JSON.stringify(body);\n  }\n  return { statusCode, body, headers };\n};\nStep 3: Configuring API Gateway We then configure an HTTP API using API Gateway to expose our Lambda function as RESTful endpoints. We create routes for the GET, PUT, and DELETE methods to perform CRUD operations on the DynamoDB table.\nCreate an HTTP API\nChoose GET and, for the path, enter /items/{id}\nChoose GET and, for the path, enter /items\nChoose PUT and, for the path, enter /items\nChoose DELETE and, for the path, enter /items/{id}\nFor Integration type, choose Lambda function and enter http-crud-tutorial-function\nStep 4: Testing the API With our API set up, we use tools like curl to test our endpoints. First, locate the invoke URL (found under Stages in your API details) and replace the placeholder below.\nINVOKE_URL=\u0026quot;https://abcdef123.execute-api.eu-west-1.amazonaws.com\u0026quot;\ncurl -X \u0026#34;PUT\u0026#34; -H \u0026#34;Content-Type: application/json\u0026#34; -d \u0026#34;{ \\\u0026#34;id\\\u0026#34;: \\\u0026#34;abcdef234\\\u0026#34;, \\\u0026#34;price\\\u0026#34;: 12345, \\\u0026#34;name\\\u0026#34;: \\\u0026#34;myitem\\\u0026#34; }\u0026#34; $INVOKE_URL/items\nAdd as many items as possible and test the CRUD functions.\nWhat I Learned Going through this lab, a few things stood out that I didn\u0026rsquo;t fully appreciate before.\nThe first was how little boilerplate you actually need to wire up a fully functional API on AWS. The Lambda function handles all four CRUD operations in under 60 lines of code, and API Gateway takes care of routing without you managing a single server. Coming from a background where setting up a REST API meant configuring a web server, installing dependencies, and managing processes, this felt almost too easy.\nThe second thing that surprised me was how tightly the routeKey pattern in Lambda ties to the API Gateway route definition. If the route in API Gateway is GET /items/{id} and your switch case doesn\u0026rsquo;t match that string exactly, the request falls through to the default error. It\u0026rsquo;s a small thing, but it\u0026rsquo;s the kind of detail that can cost you 30 minutes of debugging if you\u0026rsquo;re not paying attention.
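One quick way to confirm each route is wired to the matching switch case is to exercise the remaining routes against the same invoke URL. This is a hedged sketch reusing the placeholder item ID from the PUT example above:\n# List all items, fetch the new item by ID, then delete it\ncurl "$INVOKE_URL/items"\ncurl "$INVOKE_URL/items/abcdef234"\ncurl -X "DELETE" "$INVOKE_URL/items/abcdef234"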
I also noticed that the lab uses Node.js 14.x, which is now end-of-life. In a real project I\u0026rsquo;d upgrade to Node.js 20.x and swap the AWS SDK v2 calls for the v3 modular client — it\u0026rsquo;s more tree-shakeable and better suited for Lambda\u0026rsquo;s cold start constraints.\nStep 5: Conclusion Building a simple CRUD API with AWS services empowers developers to create scalable and efficient solutions for managing data. By leveraging DynamoDB for storage, Lambda for serverless compute, API Gateway for endpoint management, and Cloud9 IDE for development and testing, this lab demonstrates a streamlined approach to serverless API development on AWS.\n","permalink":"https://manassehmwangi.devatlas.org/blog/post4/crud/","summary":"Introduction API stands for Application Programming Interface. An API is a set of definitions and protocols for building and integrating application software. This interface allows different software programs to interact with each other by calling functions, passing data and accessing different capabilities.\nTypes of APIs The main types are:\nREST APIs - REST (Representational State Transfer) is a common architectural style. They typically provide CRUD (Create, Read, Update, Delete) operations and use HTTP requests such as GET, POST, PUT, DELETE.","title":"Building a Simple CRUD API"},{"content":" AWS Cloud Quest: Solutions Architect — Building advanced cloud architectures\nIntroduction I recently completed the AWS Cloud Quest: Solutions Architect in July 2023 and wanted to share the experience and path I took to finish it. AWS Cloud Quest is an open-world, role-playing game that teaches you how to create AWS solutions using cloud concepts and real-life exercises. Unlike the Cloud Practitioner edition, the Solutions Architect role focuses on designing resilient, high-performing, and cost-optimized architectures. It features 26 scenario-based assignments that go well beyond the basics, covering topics like container services, serverless, CDN, and infrastructure automation.\nAWS Cloud Quest Essentials Topics Link to the game on Skill Builder AWS Skill Builder The 26 real-life scenario assignment labs include:\nHighly Available Web Applications Networking Concepts File Systems in the Cloud Core Security Concepts Cloud Computing Essentials Connecting VPCs First NoSQL Database Cloud Economics Databases in Practice Auto-Healing and Scaling Architecture Computing Solutions Cloud First Steps DNS Analyzing Network Traffic Single-Page App Container Services Decoupling Applications Auto-Healing and Scaling Applications Resource Monitoring Content Delivery Networks Backing up Data Deploying RESTful APIs Serverless Foundations API with Database Automation with CloudFormation Resource Governance In the next sections I\u0026rsquo;ll cover 4 main areas in terms of interface and world (it is a game after all), content and quests (how does the knowledge align with the game), user interaction and performance (how does the system and environment perform) and learning experience (how effective is it in teaching cloud concepts and practice). 
Followed by an overall summary of the review and a recommendation compared to some other AWS free education programs.\nWhen you come to the Solution Center to build the solution and complete the quest, you have 4 steps you need to go through:\nLearn Plan Practice DIY 1: Learn In the learn section you can interact with a diagram of the solution you build and can watch videos of the concepts and services that you will use. I must admit, the diagram was really helpful: it gets learners who are new to cloud used to the concept of solution or architecture diagrams, as well as explaining what the services are and the way they function. The less great part of this step was the fact that at times the videos took long to load, so give them time to load, then watch them. They help one understand the services in the lab and give insights. I highly recommend you watch them even though there might be a glitch or a lag. 2: Plan In the plan section you will use the architecture diagram to creatively come up with the solution for the challenge. It displays the practice lab goals, where this section gives the steps to follow to achieve the goal, and the DIY goals, where one applies the knowledge gained. 3: Practice This is the interactive hands-on area. Here, you will get to implement an initial solution using the actual AWS Management Console itself. It will provide you with an AWS Console session that will expire in 2 hours and 30 minutes. Just as the name of this section implies, this part is mainly for practicing the steps for the final solution that you will implement in the next section. Once you click the Open AWS Console button, a new tab will open and you will be directed to the AWS Management Console homepage. When you are through with the hands-on section, this image will be displayed showing you\u0026rsquo;ve finished the section. Click on the DIY button to go to the next step. 4: DIY This is the final part of the Solution Center. Here, you apply your own solution based on the learnings that you have acquired in the Practice section. Conclusion AWS made learning the Cloud fun, and you can learn on lunch breaks or in your free time. This game gives you a good amount of information about AWS Cloud and is both entertaining and interactive at the same time. Again, this is free to play for everyone and all you need is an Amazon account. I think AWS did a great job in developing the game, making it highly interactive and user-friendly.\n","permalink":"https://manassehmwangi.devatlas.org/blog/post3/questsolutions/","summary":"AWS Cloud Quest: Solutions Architect — Building advanced cloud architectures\nIntroduction I recently completed the AWS Cloud Quest: Solutions Architect in July 2023 and wanted to share the experience and path I took to finish it. AWS Cloud Quest is an open-world, role-playing game that teaches you how to create AWS solutions using cloud concepts and real-life exercises. Unlike the Cloud Practitioner edition, the Solutions Architect role focuses on designing resilient, high-performing, and cost-optimized architectures.","title":"AWS Cloud Quest: Solutions Architect adventure"},{"content":" Steps to acing the AWS — Certified Cloud Practitioner (CCP) Exam — CLF-C01\nIntroduction I cleared the AWS Cloud Practitioner Certification Exam back in December 2021 and thought of sharing my experience and the path I took to complete it. AWS — Certified Cloud Practitioner (CCP) — CLF-C01 is a foundational level AWS certification. 
I believe that a certificate is only one validation method of existing skills. While preparing for this, I understood how the AWS platform works and the basic building blocks behind it. I learnt most of the AWS services, from the critical ones to the small ones, if not all. My background: when I sat for the exam, I was in my 3rd year of university, studying for a Bachelor of Science in Information Technology.\nAWS Cloud Practitioner Essentials course Link to the course on Skill Builder AWS Skill Builder This free AWS course covers the technical aspects of the AWS platform needed for the AWS Cloud Practitioner Certification Exam. It covers technical details of AWS services, along with the fundamentals needed to map services to your business requirements.\nIt covers:\nAWS foundation services like Elastic Compute Cloud (EC2), Virtual Private Cloud (VPC), Simple Storage Service (S3), and Elastic Block Store (EBS)\nAWS Database services: DynamoDB and Relational Database Service (RDS)\nAWS Management services: Auto Scaling, CloudWatch, Elastic Load Balancing (ELB), and Trusted Advisor\nDomain 1: Cloud Concepts This subject area focuses on knowledge of cloud computing basics. It has 26% weighting in the exam.\nTo get full marks in this subject area, you should be able to:\nDefine the basic components and architecture of AWS Cloud\nDescribe the value proposition of cloud against on-premise\nSpecify AWS Cloud economics related aspects\nHave full awareness of different cloud architecture design principles like elasticity, scalability, high availability and fault tolerance.\nAWS cloud infrastructure offers services on compute, storage, security, database and management. The Cloud Architecture Design Principles are:\nOperational Excellence Security Reliability Performance Efficiency Cost Optimization Domain 2: Security and Compliance This domain covers security aspects of the AWS platform. It has 25% weighting in the exam.\nPassing this domain requires you to be acquainted with:\nAWS Cloud security tools like logging, protection against web attacks\nAWS compliance with various legal requirements and security standards like data privacy, compliance reporting\nAWS resources for security support with Trusted Advisor\nAWS Shared Responsibility model Access management resources in AWS like MFA, IAM, password policies Domain 3: Technology Technology is the most crucial domain as it has 33% weighting in the exam. It focuses on how well you have grasped the core AWS platform’s workings and services. Various services in AWS for provisioning IT infrastructure and the deployment of applications are\nAWS Elastic Beanstalk – A platform configuration defines the infrastructure and software stack to be used for a given environment.\nAWS CloudFormation – It is a service to model and set up your Amazon Web Services resources AWS OpsWorks – Configuration management service from AWS in 2 variations: AWS OpsWorks for Puppet Enterprise and AWS OpsWorks for Chef Automate. AWS CodeDeploy – AWS service for automation of code and software deployments to any AWS instance. 
AWS CodePipeline – A CI/CD or continuous integration/continuous delivery service from AWS Amazon EC2 Container Service – AWS container management service for Docker containers AWS Global Infrastructure involves knowledge of: Availability Zones (AZs), Regions, Edge Locations and Regional Edge Caches.\nDomain 4: Billing and Pricing Though it is the lowest-weighted domain of the AWS Cloud Practitioner Certification Exam at 16%, it is crucial for knowing the business aspect of the AWS platform.\nThis section covers:\nVarious pricing models available in the AWS platform for various services Calculating AWS service costs Account structures in AWS for billing and pricing, like BU-based or project-based accounting structures AWS billing support resources like the cost calculator That\u0026rsquo;s all folks After completing all lessons, practice extensively to score high on AWS Cloud Practitioner practice tests. After practicing, schedule and sit for the exam.\nHurray! You have cracked the AWS Cloud Practitioner certification exam.\n","permalink":"https://manassehmwangi.devatlas.org/blog/post1/ccp-exam/","summary":"Steps to acing the AWS — Certified Cloud Practitioner (CCP) Exam — CLF-C01\nIntroduction I cleared the AWS Cloud Practitioner Certification Exam back in December 2021 and thought of sharing my experience and the path I took to complete it. AWS — Certified Cloud Practitioner (CCP) — CLF-C01 is a foundational level AWS certification. I believe that a certificate is only one validation method of existing skills. While preparing for this, I understood how the AWS platform works and the basic building blocks behind it.","title":"Steps to acing the AWS — Certified Cloud Practitioner"},{"content":"My Resume Cloud Engineer experienced in building and operating scalable cloud infrastructure in Agile environments. Skilled in AWS architecture, Kubernetes, Terraform, CI/CD automation, monitoring, and incident response, collaborating with cross-functional teams to deliver secure and reliable systems.\nWork Experience Centre for Education and Sustainable Development in Africa Site Reliability Engineer | January 15th 2025 – Present Ensure high availability and optimal performance of the Learning Management System (LMS) through proactive monitoring, incident resolution, and cross-team collaboration. Gained hands-on experience in Agile workflows and SDLC processes. Performed SQL queries to verify data accuracy, reporting integrity, and student engagement metrics. Led the organization-wide migration to Google Workspace (GSuite) to enhance communication, reduce operational friction and improve cross-functional collaboration. AWS Community Builder Program Community Builder | August 15th 2023 – Present Actively contribute AWS-related content to platforms like Dev.to, Medium, and YouTube, focusing on practical insights and demonstrations of AWS services. Collaborate with a network of AWS Community Builders to tackle complex technical issues and test new AWS Cloud resources. Participate in AWS Kenya User-group events, contributing as a volunteer and session moderator. Public Service Commission (Kenyatta University) Web Developer | April 8th 2024 – Present Skilled in working with popular CMS platforms, including Joomla, WordPress, and Drupal, to develop user-friendly and functional websites. 
Proficient in PHP, MySQL, JavaScript, and HTML for end-to-end web development and backend integration, enhancing website functionality and security. Performed exploratory testing to uncover usability issues and ensure a seamless user experience. Freelance (Fiverr, Upwork) Site Reliability Engineer | December 15th 2022 – Present Architected, deployed, and managed scalable solutions on AWS, Azure, and GCP for clients. Leveraged Terraform and AWS CloudFormation for automated infrastructure deployment. Set up and managed CI/CD pipelines, implementing automated deployment processes and integrating monitoring tools (Prometheus, Grafana). Collaborate with developers, product managers, and support teams to verify bug fixes and improve overall software quality. MENTOR SACCO IT \u0026amp; Quality Support Intern May 2022 to July 2022\nExecuted manual functional testing for Mentor’s mobile applications. Supported system verification, defect logging, and escalation to technical teams. Validated fixes and updates prior to customer rollout. Gained exposure to banking systems, SLAs, and QA standards. Skills Programming languages C Java YAML Python\nTechnical Cloud Services Containerization and Orchestration Networking and Database Management Cloud Security Android Dev Web Development Infrastructure as Code (IaC) tools\nEducation KCA University Nairobi, Kenya BSc. Information Technology 2018-2022 Grade: Second Class Upper Division\nCertifications Kubernetes and Cloud Native Associate (KCNA) Linux Foundation September 27, 2025 Credential ID: 65c3780590d44b03988150edc6fc19eb\nAWS Certified Solutions Architect - Associate Amazon Web Services Training and Certification December 15, 2021 Credential ID: 65c3780590d44b03988150edc6fc19eb\nGitHub Foundations August 12, 2024 Credential ID VERIFIED\nAWS Cloud Quest: Solutions Architect Amazon Web Services Training and Certification July 04, 2023 Credential ID VERIFIED\nAWS Cloud Quest Cloud Practitioner Amazon Web Services Training and Certification June 18, 2023 Credential ID VERIFIED\nAWS Certified Cloud Practitioner Amazon Web Services Training and Certification December 15, 2021 Credential ID: YH0P5SQBHMEEQ5SR\nAWS Machine Learning Foundations course Udacity October 22, 2021 Credential ID 6PHP6UNN\nAndroid Development Google Developers Group March 16, 2023 Credential ID 036E5A7FB61593D\nCCSK v4.1 Foundation Training Cloud Security Alliance 30 Dec 2022\nCredential ID VERIFIED\nHonors \u0026amp; awards Cyber Shujaa Program: Cloud \u0026amp; Network Security This was a scholarship to study Cloud Computing services, Cloud Security controls and DevSecOps for 4 months with Microsoft Azure guided labs by Dr Paula, Prof Judy and Instructor Keith at United States International University - Africa. (September - December 2022) Cisco® CCNA Introduction to Networks \u0026amp; Routing and Switching Essentials I was awarded two letters of merit signed by Cisco CEO Chuck Robbins, which are awarded to students and instructors who score 75% or above on their course final exam to acknowledge the excellence they have achieved. Independent Electoral and Boundaries Commission ICT Clerk tasked with handling the KIEMS kit and confirming voters’ identification documents in August 2022. Kenya National Bureau of Statistics Enumerator tasked with going round the Enumeration Block, identifying its boundaries and obtaining accurate answers using CAPI (Computer Assisted Personal Interview) in 2019. 
","permalink":"https://manassehmwangi.devatlas.org/experience/","summary":"My Resume Cloud Engineer experienced in building and operating scalable cloud infrastructure in Agile environments. Skilled in AWS architecture, Kubernetes, Terraform, CI/CD automation, monitoring, and incident response, collaborating with cross-functional teams to deliver secure and reliable systems.\nWork Experience Centre for Education and Sustainable Development in Africa Site Reliability Engineer | January 15th 2025 – Present Ensure high availability and optimal performance of the Learning Management System (LMS) through proactive monitoring, incident resolution, and cross-team collaboration.","title":""},{"content":" Steps to acing the AWS — Certified Cloud Practitioner (CCP) Exam — CLF-C01\nIntroduction I recently cleared the AWS- Cloud Quest Cloud Practitioner in June 2023 and thought of sharing my the experience and the path I took to complete this. AWS Cloud Quest is an open-world, role-playing game that teaches you how to create AWS solutions using cloud concepts and exercises based on real life. It helps you learn the intricacies of modern cloud technologies and have fun at the same time. Cloud Quest features 12 real-life scenario assignments which prepare and train you to become an AWS Cloud Practitioner at absolutely no cost. This free game targets audiences that are new to Cloud Computing and people who want to learn the basics of Amazon Web Services (AWS).\nAWS Cloud Quest Essentials Topics Link to the game on Skill Builder AWS Skill Builder The 12 real-life scenario assignment labs include:\nHighly Available Web Applications Networking Concepts File Systems in the Cloud Core Security Concepts Cloud Computing Essentials Connecting VPCs First NoSQL Database Cloud Economics Databases in Practice Auto-Healing and Scaling Architecture Computing Solutions Cloud First Steps In the next sections I\u0026rsquo;ll cover 4 main areas in terms of interface and world (it is a game after all), content and quests (how does the knowledge align with the game), user interaction and performance (how does the system and environment perform) and learning experience (how effective is it in teaching cloud concepts and practice). Followed by an overall summary of the review and recommendation compared to some other AWS free education programs.\nWhen you come to the Solution Center to build the solution and complete the quest, you have 4 steps you need to go through:\nLearn Plan Practice DIY 1: Learn In the learn section you can interact with a diagram of the solution you build and can watch videos of the concepts and services that you will use. I must admit, the diagram was really helpful and gets new to cloud learners used to the concept of solution or architecture diagrams as well as explain what the services are and the way they function. The less great part of this step was the factt that at times the videos took long to load, therefore give it time for the videos to load then watch them. They help one understand the services in the lab and give insights. I highly recommend you watch them even though there might be a glitch or a lag. 2: Plan In the plan section you will use the architecture diagram to creatively come up with the solution for the challenge. It displays the practice lab goals in which this section gives the steps to follow to achieve the goal and the DIY goals where one does this section using the knowledge gained. 3: Practice This is the interactive hands-on area. 
Here, you will get to implement an initial solution using the actual AWS Management Console itself. It will provide you with an AWS Console session that will expire in 2 hours and 30 minutes. Just as the name of this section implies, this part is mainly for practicing the steps for the final solution that you will implement in the next section. Once you click the Open AWS Console button, a new tab will open and you will be directed to the AWS Management Console homepage. When you are through with the hands-on section, this image will be displayed showing you\u0026rsquo;ve finished the section. Click on the DIY button to go to the next step. 4: DIY This is the final part of the Solution Center. Here, you apply your own solution based on the learnings that you have acquired in the Practice section. Conclusion AWS made learning the Cloud fun, and you can learn on lunch breaks or in your free time. This game gives you a good amount of information about AWS Cloud and is both entertaining and interactive at the same time. Again, this is free to play for everyone and all you need is an Amazon account. I think AWS did a great job in developing the game, making it highly interactive and user-friendly.\n","permalink":"https://manassehmwangi.devatlas.org/blog/post2/quest/","summary":"AWS Cloud Quest: Cloud Practitioner adventure\nIntroduction I recently cleared the AWS Cloud Quest: Cloud Practitioner in June 2023 and thought of sharing my experience and the path I took to complete it. AWS Cloud Quest is an open-world, role-playing game that teaches you how to create AWS solutions using cloud concepts and exercises based on real life. It helps you learn the intricacies of modern cloud technologies and have fun at the same time.","title":"AWS Cloud Quest: Cloud Practitioner adventure"}]