Google Cloud Next '22 Recap: Serverless & Data edition | Daniel Zivkovic
See what's new in #Serverless and #Data at GCP. Our guest, Guillaume Blaquiere - Stack Overflow contributor & #GCP #Developer Expert from France, covered the best #GoogleCloudNext announcements, practically demoed how to benefit from #BigQuery Remote Functions and answered many questions.
The meetup recording with TOC for easy navigation is at https://youtu.be/AuZZTwHIcdY
P.S. For more interactive lectures like this, go to http://youtube.serverlesstoronto.org/ or sign up for our upcoming live events at https://www.meetup.com/Serverless-Toronto/events/
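The Remote Functions demo itself isn't reproduced here, but the endpoint contract is simple: BigQuery POSTs a JSON body with a `calls` array (one inner list of arguments per row) and expects a `replies` array back, one result per call. A minimal sketch of such a handler, with a hypothetical string-joining function as the payload logic:

```python
def remote_concat(request_json):
    """Hypothetical BigQuery Remote Function handler body.

    BigQuery POSTs {"calls": [[arg1, arg2, ...], ...]} (one inner list
    per row) and expects {"replies": [...]} with one reply per call.
    """
    replies = []
    for call in request_json["calls"]:
        # Example logic only: upper-case each argument and join with '-'.
        replies.append("-".join(str(arg).upper() for arg in call))
    return {"replies": replies}
```

On the SQL side, such an endpoint would be registered with `CREATE FUNCTION ... REMOTE WITH CONNECTION`, after which it can be called like any scalar function in a query.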
3 reasons to pick a time series platform for monitoring dev ops driven contai... | DevOps.com
In this webinar, Navdeep Sidhu, Head of Product Marketing at InfluxData, will review why you should use a Time Series Database (TSDB) for your important time series data rather than one of the traditional datastores you may have used in the past. Join us to learn why you should consider implementing a new monitoring strategy as you upgrade your application architecture.
Slide deck from BrightGen's webinar on the new features provided by the Salesforce Winter '23 release. Presented by Keir Bowden, CTO, on 12 October 2022.
This deck and webinar covers the features that we believe are of most interest to our customers and thus does not represent the entire release.
View the webinar recording at: https://youtu.be/G_WYKYgp5f4
apidays LIVE Jakarta - Building an Event-Driven Architecture by Harin Honesty... | apidays
apidays LIVE Jakarta 2021 - Accelerating Digitisation
February 24, 2021
Building an Event-Driven Architecture
Harin Honestyandi Parandika, Microservice and Middleware Designer at XL Axiata
This document discusses data replication and Informatica's data replication solution. It defines data replication as automating the cloning of thousands of application tables in real-time while managing transaction data capture, routing, and delivery. Informatica's data replication provides continuous availability during upgrades, reduces IT costs by offloading to lower cost systems, and enables uninterrupted migrations. It replicates transactional changes between source and target systems with high extraction and apply speeds. The solution benefits data warehouses, real-time reporting, migrations, and auditing requirements.
App modernization projects are hard. Enterprises are looking to cloud-native platforms like Pivotal Cloud Foundry to run their applications, but they’re worried about the risks inherent to any replatforming effort.
Fortunately, several repeatable patterns of successful incremental migration have emerged.
In this webcast, Google Cloud’s Prithpal Bhogill and Pivotal’s Shaun Anderson will discuss best practices for app modernization and securely and seamlessly routing traffic between legacy stacks and Pivotal Cloud Foundry.
The document summarizes the Packaging Repository application, which centrally manages packaging for automotive components at RENAULT. It is currently developed in Java-J2EE but the goal is to migrate it to Salesforce's cloud platform. The summary discusses:
1) The application allows for creating and managing packaging codes, characteristics, and documents from 5 origins. It has different user roles for administration, validation, coordination, and viewing.
2) Packaging goes through statuses of draft, under study, and validated as part of its lifecycle managed by administrators and validators.
3) The application architecture follows Apex design patterns like separation of concerns (SOC) with domain, service, and controller layers to
Best Practices for Streaming IoT Data with MQTT and Apache Kafka | Kai Wähner
Organizations today are looking to stream IoT data to Apache Kafka. However, connecting tens of thousands or even millions of devices over unreliable networks can create some architecture challenges. In this session, we will identify and demo some best practices for implementing a large scale IoT system that can stream MQTT messages to Apache Kafka.
We use HiveMQ as open source MQTT broker to ingest data from IoT devices, ingest the data in real time into an Apache Kafka cluster for preprocessing (using Kafka Streams / KSQL), and model training + inference (using TensorFlow 2.0 and its TensorFlow I/O Kafka plugin).
We leverage additional enterprise components from HiveMQ and Confluent to allow easy operations, scalability and monitoring.
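In a production deployment the MQTT-to-Kafka hop is handled by components like HiveMQ's Kafka extension, but the core mapping decision is worth making explicit: which part of the MQTT topic becomes the Kafka topic, and which becomes the message key. A sketch under an assumed topic scheme (the `car/telemetry/<device-id>` layout is hypothetical, not from the talk):

```python
def mqtt_to_kafka_record(mqtt_topic, payload):
    """Map an MQTT topic like 'car/telemetry/<device-id>' to a Kafka record.

    The device id becomes the Kafka message key, so all events from one
    device land in the same partition and keep their per-device ordering.
    """
    parts = mqtt_topic.split("/")
    kafka_topic = "-".join(parts[:-1])   # e.g. 'car-telemetry'
    key = parts[-1]                      # e.g. the device id
    return {"topic": kafka_topic, "key": key, "value": payload}
```

Keying by device id is what lets downstream Kafka Streams / KSQL jobs process each vehicle's events in order without a global sort.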
Viele Autos, noch mehr Daten: IoT-Daten-Streaming mit MQTT & Kafka (Kai Waehn... | confluent
This document discusses best practices for streaming IoT data with MQTT and Apache Kafka. It begins with an overview of a use case involving a global automotive company building a connected car infrastructure. An architecture is presented showing how sensor data from cars can be ingested via MQTT into Apache Kafka and then processed using tools like Kafka Streams, TensorFlow, and ElasticSearch for analytics and alerts. A live demo is described that implements this full pipeline. The document concludes with a discussion of best practices around choosing the right tools, separation of concerns, data types, and next steps.
Data & Analytics Forum: Moving Telcos to Real Time | SingleStore
MemSQL is a real-time database that allows users to simultaneously ingest, serve, and analyze streaming data and transactions. It is an in-memory distributed relational database that supports SQL, key-value, documents, and geospatial queries. MemSQL provides real-time analytics capabilities through Streamliner, which allows one-click deployment of Apache Spark for real-time data pipelines and analytics without batch processing. It is available in free community and paid enterprise editions with support and additional features.
HashiConf '19
Explaining how we use Inversion of Control at Criteo to create very effective types of services
https://hashiconf.hashicorp.com/schedule/inversion-of-control-with-consul
Use case for the financial industry using Mule ESB. This unique project and use case shows that, using a lightweight ESB like Mule, it is easy to adapt and scale out on utility hardware. Beyond scale-out, it is also easy to migrate legacy batch-based applications into workflow-enabled, Active-Active applications.
VMworld 2013: Moving Enterprise Application Dev/Test to VMware’s Internal Pri... | VMworld
This document discusses moving an enterprise application development and testing environment to VMware's internal private cloud. Key points:
- The AppOps team previously managed development environments manually, which was slow, unreliable, and reduced developer productivity.
- VMware chose to replace the traditional infrastructure with a private cloud based on VMware's Software Defined Data Center (SDDC) and automate the provisioning process.
- With automation and the private cloud, process time dropped from 4 weeks to 36 hours, developer productivity increased 20% or more, and annual infrastructure costs reduced by $6 million.
- Lessons learned include separating the automation build team from operations, taking a phased approach, and using integrated cloud
5 Reasons DevOps Toolchain Needs Time-Series Based Monitoring | DevOps.com
Monolithic architectures are being replaced by microservices-driven apps, and the cloud-based infrastructure is being tied together and instrumented by DevOps processes. This is driving the need for greater visibility and better monitoring. Legacy monitoring solutions fail to deliver the much-needed sub-second visibility. Let’s take a look at time-series platforms and how they deliver the level of visibility and monitoring needed by today’s DevOps initiatives.
In this webinar, we will take a look at Time-Series Data Platforms and outline how InfluxData’s leading Time-Series data platform can deliver the next-gen monitoring for your DevOps projects.
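The "sub-second visibility" claim comes down to how raw samples are rolled up. As a rough illustration (not InfluxData's implementation), aggregating `(timestamp_ms, value)` points into fixed 100 ms windows might look like:

```python
from collections import defaultdict

def downsample(points, window_ms=100):
    """Aggregate (timestamp_ms, value) samples into fixed windows.

    Returns {window_start_ms: mean value} -- the kind of sub-second
    rollup a time-series platform computes for monitoring dashboards.
    """
    buckets = defaultdict(list)
    for ts, value in points:
        buckets[(ts // window_ms) * window_ms].append(value)
    return {start: sum(vs) / len(vs) for start, vs in sorted(buckets.items())}
```

A real TSDB does this continuously and at scale, with retention policies deciding how long the raw points survive next to the rollups.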
VMware Cloud Management Platform (CMP) provides tools to manage private, public and hybrid clouds. It enables IT organizations to act as brokers of IT services through a software-defined data center that provides automation, operations management and business insights. CMP includes vRealize Automation for automated provisioning and lifecycle management, vRealize Operations for visibility and optimization, and integrates with NSX and vSphere for network and security virtualization. This allows organizations to more efficiently deliver IT services, plan capacity, and provide transparency into costs.
IBM Blockchain Platform - Architectural Good Practices v1.0 | Matt Lucas
This document discusses architectural good practices for blockchains and Hyperledger Fabric performance. It provides an overview of key concepts like transaction processing in Fabric and performance metrics. It also covers optimizing different parts of the Fabric network like client applications, peers, ordering service, and chaincode. The document recommends using tools like Hyperledger Caliper and custom test harnesses for performance testing and monitoring Fabric deployments. It highlights lessons learned from real projects around reusing connections and load balancing requests.
Abdul Rahman J has over 6 years of experience developing on the T24 banking software. He has expertise in areas like T24 core modules, API development, template programming, and testing. Some of his projects include componentization of T24, adding US date format support, and module upgrades. He is proficient in languages like Java and C/C++ and tools like Eclipse IDE. He aims to provide quality deliverables through adherence to SDLC processes and testing best practices.
Intel IT Open Cloud - What's under the Hood and How do we Drive it? | Odinot Stanislas
Intel IT is reinventing itself, setting out to act as a "Cloud Service Provider". The transformation is under way, with a federated, interoperable, and open cloud on the agenda, along with a maturity framework, DevOps, and a willingness to take risks. In short, genuinely interesting.
Serena Release Management approach and solutions | Softmart
Kev Holmes is a British computer scientist with over 30 years of experience in software development tools and processes. The document discusses the increased complexity of software deployments over time due to factors like larger teams, virtualization, and offshore development. It introduces concepts like continuous integration/deployment and DevOps to help manage this complexity. Serena Release Automation is presented as a solution that can model deployment structures, support multiple deployment processes, define target environments, act as a system of record, integrate with other tools, and provide reporting to automate the end-to-end software deployment process.
This document introduces a self-service metadata driven data loading platform developed by Walmart to simplify and optimize the process of onboarding and running data applications. The key components of the platform include a centralized metadata store, connectors to integrate various data sources and targets, an orchestrator to build optimized execution plans, a schedule optimizer to prioritize jobs, and telemetry dashboards for monitoring. The goal of the platform is to dramatically increase developer productivity, provide a low-code experience, and intelligently manage resources and job scheduling across applications.
MDOP session from Microsoft partner boot camp | Olav Tvedt
This document summarizes Advanced Group Policy Management (AGPM), a tool that enhances group policy management in Microsoft environments. AGPM provides versioning, history, and rollback of group policy changes. It enables change management workflows and role-based administration with delegation controls. Customers report that AGPM gives them better control over group policies and reduces downtime from misconfigured policies. The architecture involves a server component that stores backups of group policy objects and an administrative client.
Advanced Flow Concepts Every Developer Should Know | Peter Caitens
Tim Combridge from Sensible Giraffe and Salesforce Ben presents some important tips that all developers should know when dealing with Flows in Salesforce.
Importance of ‘Centralized Event collection’ and BigData platform for Analysis! | Piyush Kumar
The document discusses the importance of centralized event collection and analysis using a big data platform. It describes the challenges faced by MakeMyTrip in analyzing huge amounts of data from various sources. Centralized logging of structured event data from all systems and applications is recommended to enable effective log analysis, troubleshooting, and personalizing the user experience. A data service platform is needed to integrate data from different sources and power real-time and batch processing for analytics and insights.
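As an illustration of the recommended structured logging, a minimal event builder might enforce a shared schema across services. The field names below are assumptions for the sketch, not MakeMyTrip's actual schema:

```python
import time

def make_event(service, event_type, **fields):
    """Build one structured log event as a dict ready for JSON serialization.

    A common envelope across services (timestamp, service, type, payload)
    is what makes centralized collection and later analysis tractable.
    """
    return {
        "ts": fields.pop("ts", int(time.time() * 1000)),  # epoch millis
        "service": service,
        "type": event_type,
        "payload": fields,
    }
```

Emitting every event through one such helper, then shipping the JSON lines to a central store, is the pattern the talk argues for: the analysis platform can then filter and join on the envelope fields regardless of which application produced the event.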
Special Meetup Edition - TDX Bengaluru Meetup #52.pptx | shyamraj55
We’re bringing the TDX energy to our community with 2 power-packed sessions:
🛠️ Workshop: MuleSoft for Agentforce
Explore the new version of our hands-on workshop featuring the latest Topic Center and API Catalog updates.
📄 Talk: Power Up Document Processing
Dive into smart automation with MuleSoft IDP, NLP, and Einstein AI for intelligent document workflows.
Understanding Traditional AI with Custom Vision & MuleSoft.pptx | shyamraj55
Slide Deck Description:
This presentation features Atul, a Senior Solution Architect at NTT DATA, sharing his journey into traditional AI using Azure's Custom Vision tool. He discusses how AI mimics human thinking and reasoning, differentiates between predictive and generative AI, and demonstrates a real-world use case. The session covers the step-by-step process of creating and training an AI model for image classification and object detection, specifically an ad display that adapts based on the viewer's gender. Atul highlights the ease of implementation without deep software or programming expertise. The presentation concludes with a Q&A session addressing technical and privacy concerns.
Dreamforce Tour: MuleSoft Meets AI: IDP for Modern Enterprises | shyamraj55
This transcript captures insights from a MuleSoft meetup focused on integrating MuleSoft with AI and Intelligent Document Processing (IDP) for modern enterprises. Co-hosted by the Bangalore and Mysore meetup groups, the event featured speakers Pranav and Priya, who discussed the evolution and applications of AI, along with the importance of responsible AI usage. They showcased how MuleSoft connects data sources with AI models to enhance enterprise solutions. Priya demonstrated IDP's role in automating invoice processing and explored its future potential with Einstein AI. The session wrapped up with a Q&A, addressing queries on IDP implementation and best practices.
Global Exception Handling Custom Error Connector In MuleSoft | shyamraj55
Global Exception Handling Custom Error Connector In MuleSoft | Bangalore MuleSoft Meetup #43
This presentation covers a technical discussion on error handling and custom model development using MuleSoft, a platform for building application networks. It outlines key topics such as error-handling strategies, building custom models, and implementing global exception handling. The slides include a demo on creating custom XML SDKs and emphasize the importance of robust exception management for large-scale applications. Additionally, it explores the process of developing and publishing custom connectors to the MuleSoft Exchange, focusing on version control and addressing production error challenges.
Integrating Kafka with MuleSoft 4 and use case | shyamraj55
In these slides, the speaker shares their experiences in the IT industry, focusing on the integration of Apache Kafka with MuleSoft. They start by providing an overview of Kafka, detailing its pub-sub model, its ability to handle large volumes of data, and its role in real-time data pipelines and analytics. The speaker then explains Kafka's architecture, covering topics such as partitions, producers, consumers, brokers, and replication.
The discussion moves on to Kafka connector operations within MuleSoft, including publish, consume, commit, and seek, which are demonstrated in a practical demo. The speaker also emphasizes important design considerations like connector configuration, flow design, topic management, consumer group management, offset management, and logging. The session wraps up with a Q&A segment where various Kafka-related queries are addressed.
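The connector operations listed above (consume, commit, seek) follow standard Kafka consumer semantics. A toy in-memory model of a single partition's cursor (an illustration, not the MuleSoft connector API) shows how the read position and the committed offset move independently:

```python
class PartitionCursor:
    """Toy model of a Kafka consumer's position on one partition."""

    def __init__(self, records):
        self.records = records   # the partition's append-only log
        self.position = 0        # next offset to read
        self.committed = 0       # last committed (durably acknowledged) offset

    def consume(self):
        """Return the next record, or None at end of log."""
        if self.position >= len(self.records):
            return None
        record = self.records[self.position]
        self.position += 1
        return record

    def commit(self):
        # Mark everything consumed so far as processed.
        self.committed = self.position

    def seek(self, offset):
        # Rewind (or skip ahead) for replay; does not touch committed.
        self.position = offset
```

The key design consequence, which the session's points on offset management reflect, is that a crash between consume and commit means records are redelivered, so processing should be idempotent.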
Let's Integrate MuleSoft RPA, COMPOSER, APM with AWS IDP along with Slack | shyamraj55
Discover the seamless integration of RPA (Robotic Process Automation), COMPOSER, and APM with AWS IDP enhanced with Slack notifications. Explore how these technologies converge to streamline workflows, optimize performance, and ensure secure access, all while leveraging the power of AWS IDP and real-time communication via Slack notifications.
OAuth 2.0 Introduction and Flows with MuleSoft | shyamraj55
Learn about the basics of OAuth 2.0 and the different OAuth flows in this introductory video. Understand how OAuth works and the various authorization mechanisms involved.
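Of the flows the video covers, the authorization-code flow starts with a browser redirect to the authorization server (RFC 6749, section 4.1). A sketch of building that step-1 URL; the endpoint and parameter values here are placeholders:

```python
from urllib.parse import urlencode

def build_authorize_url(auth_endpoint, client_id, redirect_uri, scope, state):
    """Build the initial browser redirect of the OAuth 2.0
    authorization-code flow."""
    params = {
        "response_type": "code",   # asks the server for an authorization code
        "client_id": client_id,
        "redirect_uri": redirect_uri,
        "scope": scope,
        "state": state,            # CSRF protection, echoed back to the client
    }
    return auth_endpoint + "?" + urlencode(params)
```

After the user consents, the server redirects back to `redirect_uri` with a short-lived `code`, which the client exchanges (together with its secret) at the token endpoint for an access token.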
ServiceNow Integration with MuleSoft.pptx | shyamraj55
- The document outlines an agenda for a Patna MuleSoft Meetup on integrating ServiceNow with MuleSoft.
- The agenda includes an overview of ServiceNow, a demonstration of the ServiceNow connector in MuleSoft, and time for Q&A and networking.
- The speaker, Vandana Gouda, will introduce ServiceNow and how to setup a developer account and instance. She will then demonstrate how to use the ServiceNow connector in MuleSoft to create and retrieve incidents in a ServiceNow instance.
Anypoint Code Builder, Google Pub/Sub connector and MuleSoft RPA | shyamraj55
This document contains an agenda and summaries of presentations for a MuleSoft meetup event on Women Who APAC/Bangalore. The agenda includes introductions, presentations on Anypoint Code Builder, Google Pub/Sub Connector, and RPA. Speakers will discuss Code Builder features and a demo, use cases and a demo of Google Pub/Sub and the connector, and the RPA lifecycle and components with a demo. Time is allocated at the end for questions and networking.
How to release an Open Source Dataweave Library | shyamraj55
The document summarizes a talk given by Ryan Hoegg on releasing open source Dataweave libraries. The talk covered creating reusable Dataweave transformation logic available as a Maven dependency, considerations for open source licensing including permissive and copyleft options, and tips for making libraries release-ready with strong typing, unit tests and annotations. It concluded with Ryan sharing his experience in open sourcing libraries and ways for attendees to get involved in the MuleSoft community.
Unleash the Solace Pub Sub connector | Bangalore MuleSoft Meetup #31 | shyamraj55
This document provides an overview and introduction to the Solace PubSub+ Connector for Mulesoft. It discusses the benefits of using the native Solace connector over a generic JMS connector, including specific configuration for Solace and ability to import event schemas. Key features of the connector are listed, such as publishing, consuming, requesting, and acknowledging messages. Compatibility and dependency information is also presented. The document concludes with an announcement for an upcoming meetup on developing Open Source Dataweave libraries.
Munit In Mule 4 | Patna MuleSoft Meetup #26 | shyamraj55
The document summarizes a meetup about Munit testing in Mule 4. It discusses what Munit is, how it can be used to test Mule applications, and provides a demo of key Munit features like mocking processors, assertions, and verifying calls. The meetup agenda includes introductions, an overview of Munit capabilities, and a Q&A session. The speaker is a senior MuleSoft consultant with experience in Munit and automated testing.
An overview of Anypoint API Community Manager | shyamraj55
This document summarizes a meetup about Anypoint Community Manager hosted by the Patna MuleSoft Meetup Group. The meetup included an overview of Anypoint Community Manager, a demonstration of creating a community using its default template and setting up user profiles and permissions. It discussed how Anypoint Community Manager can be used to create customized developer portals and engage API developers through features like forums, documentation and case management. The meetup encouraged participants to provide feedback and suggestions to improve future meetups.
This document provides an agenda for a MuleSoft meetup on cryptography in MuleSoft. The agenda includes an introduction, overview of cryptography concepts, demonstrations of cryptography functionality in MuleSoft like encryption, decryption, signatures, and a Q&A session. Attendees are asked to introduce themselves and provide their name, company, location, and MuleSoft experience. The meetup speaker is then introduced.
Role of Data Annotation Services in AI-Powered ManufacturingAndrew Leo
From predictive maintenance to robotic automation, AI is driving the future of manufacturing. But without high-quality annotated data, even the smartest models fall short.
Discover how data annotation services are powering accuracy, safety, and efficiency in AI-driven manufacturing systems.
Precision in data labeling = Precision on the production floor.
AI and Data Privacy in 2025: Global TrendsInData Labs
In this infographic, we explore how businesses can implement effective governance frameworks to address AI data privacy. Understanding it is crucial for developing effective strategies that ensure compliance, safeguard customer trust, and leverage AI responsibly. Equip yourself with insights that can drive informed decision-making and position your organization for success in the future of data privacy.
This infographic contains:
-AI and data privacy: Key findings
-Statistics on AI data privacy in the today’s world
-Tips on how to overcome data privacy challenges
-Benefits of AI data security investments.
Keep up-to-date on how AI is reshaping privacy standards and what this entails for both individuals and organizations.
Quantum Computing Quick Research Guide by Arthur MorganArthur Morgan
This is a Quick Research Guide (QRG).
QRGs include the following:
- A brief, high-level overview of the QRG topic.
- A milestone timeline for the QRG topic.
- Links to various free online resource materials to provide a deeper dive into the QRG topic.
- Conclusion and a recommendation for at least two books available in the SJPL system on the QRG topic.
QRGs planned for the series:
- Artificial Intelligence QRG
- Quantum Computing QRG
- Big Data Analytics QRG
- Spacecraft Guidance, Navigation & Control QRG (coming 2026)
- UK Home Computing & The Birth of ARM QRG (coming 2027)
Any questions or comments?
- Please contact Arthur Morgan at [email protected].
100% human made.
Artificial Intelligence is providing benefits in many areas of work within the heritage sector, from image analysis, to ideas generation, and new research tools. However, it is more critical than ever for people, with analogue intelligence, to ensure the integrity and ethical use of AI. Including real people can improve the use of AI by identifying potential biases, cross-checking results, refining workflows, and providing contextual relevance to AI-driven results.
News about the impact of AI often paints a rosy picture. In practice, there are many potential pitfalls. This presentation discusses these issues and looks at the role of analogue intelligence and analogue interfaces in providing the best results to our audiences. How do we deal with factually incorrect results? How do we get content generated that better reflects the diversity of our communities? What roles are there for physical, in-person experiences in the digital world?
Increasing Retail Store Efficiency How can Planograms Save Time and Money.pptxAnoop Ashok
In today's fast-paced retail environment, efficiency is key. Every minute counts, and every penny matters. One tool that can significantly boost your store's efficiency is a well-executed planogram. These visual merchandising blueprints not only enhance store layouts but also save time and money in the process.
This is the keynote of the Into the Box conference, highlighting the release of the BoxLang JVM language, its key enhancements, and its vision for the future.
Spark is a powerhouse for large datasets, but when it comes to smaller data workloads, its overhead can sometimes slow things down. What if you could achieve high performance and efficiency without the need for Spark?
At S&P Global Commodity Insights, having a complete view of global energy and commodities markets enables customers to make data-driven decisions with confidence and create long-term, sustainable value. 🌍
Explore delta-rs + CDC and how these open-source innovations power lightweight, high-performance data applications beyond Spark! 🚀
Mobile App Development Company in Saudi ArabiaSteve Jonas
EmizenTech is a globally recognized software development company, proudly serving businesses since 2013. With over 11+ years of industry experience and a team of 200+ skilled professionals, we have successfully delivered 1200+ projects across various sectors. As a leading Mobile App Development Company In Saudi Arabia we offer end-to-end solutions for iOS, Android, and cross-platform applications. Our apps are known for their user-friendly interfaces, scalability, high performance, and strong security features. We tailor each mobile application to meet the unique needs of different industries, ensuring a seamless user experience. EmizenTech is committed to turning your vision into a powerful digital product that drives growth, innovation, and long-term success in the competitive mobile landscape of Saudi Arabia.
Enhancing ICU Intelligence: How Our Functional Testing Enabled a Healthcare I...Impelsys Inc.
Impelsys provided a robust testing solution, leveraging a risk-based and requirement-mapped approach to validate ICU Connect and CritiXpert. A well-defined test suite was developed to assess data communication, clinical data collection, transformation, and visualization across integrated devices.
UiPath Community Berlin: Orchestrator API, Swagger, and Test Manager APIUiPathCommunity
Join this UiPath Community Berlin meetup to explore the Orchestrator API, Swagger interface, and the Test Manager API. Learn how to leverage these tools to streamline automation, enhance testing, and integrate more efficiently with UiPath. Perfect for developers, testers, and automation enthusiasts!
📕 Agenda
Welcome & Introductions
Orchestrator API Overview
Exploring the Swagger Interface
Test Manager API Highlights
Streamlining Automation & Testing with APIs (Demo)
Q&A and Open Discussion
Perfect for developers, testers, and automation enthusiasts!
👉 Join our UiPath Community Berlin chapter: https://community.uipath.com/berlin/
This session streamed live on April 29, 2025, 18:00 CET.
Check out all our upcoming UiPath Community sessions at https://community.uipath.com/events/.
Semantic Cultivators : The Critical Future Role to Enable AIartmondano
By 2026, AI agents will consume 10x more enterprise data than humans, but with none of the contextual understanding that prevents catastrophic misinterpretations.
TrsLabs - Fintech Product & Business ConsultingTrs Labs
Hybrid Growth Mandate Model with TrsLabs
Strategic Investments, Inorganic Growth, Business Model Pivoting are critical activities that business don't do/change everyday. In cases like this, it may benefit your business to choose a temporary external consultant.
An unbiased plan driven by clearcut deliverables, market dynamics and without the influence of your internal office equations empower business leaders to make right choices.
Getting things done within a budget within a timeframe is key to Growing Business - No matter whether you are a start-up or a big company
Talk to us & Unlock the competitive advantage
4. Agenda
● Introduction
● What is EDA
● Understanding Salesforce Streaming Events & Configuration
● Salesforce Connector
● Why CockroachDB
● Setting up CockroachDB
● Integrating the Salesforce replay channel & CockroachDB in Mule
● Demo
5. Event Driven Architecture
Event-driven architecture enables systems to react dynamically to changes rather than
relying on direct interactions. Events represent real-world occurrences like order
placements or payments, allowing asynchronous communication between applications.
Key Benefits:
● Real-time processing: Immediate response to changes.
● Decoupling: Reduces dependencies between services.
● Scalability: Handles high event volumes efficiently.
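The decoupling described above can be sketched as a tiny in-process publish/subscribe bus. This is an illustrative stand-in, not the MuleSoft or Salesforce API; the `EventBus` class and topic names are assumptions for the example.

```python
# Minimal sketch of event-driven decoupling: a producer publishes an event
# to a bus, and any number of subscribers react to it independently. The
# publisher never calls (or waits on) its consumers directly.
from collections import defaultdict

class EventBus:
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self._subscribers[topic].append(handler)

    def publish(self, topic, event):
        # Fan the event out to every registered handler for this topic.
        for handler in self._subscribers[topic]:
            handler(event)

bus = EventBus()
audit_log = []
bus.subscribe("order.placed", lambda e: audit_log.append(("audit", e["id"])))
bus.subscribe("order.placed", lambda e: audit_log.append(("billing", e["id"])))
bus.publish("order.placed", {"id": 42, "amount": 99.0})
print(audit_log)  # two independent consumers reacted to one event
```

Adding a third consumer later requires no change to the publisher, which is the decoupling benefit listed above.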
8. Cockroach DB
What is Cockroach DB & Key features
CockroachDB is a distributed SQL database that replicates data across multiple nodes and locations for fast, resilient access. It is designed to be scalable, fault-tolerant, and highly available.
Features:
● Distributed Architecture
● Automatic Replication
● Transactional Consistency
● Self-healing & recovery
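CockroachDB's transactional consistency relies on optimistic concurrency: a conflicting transaction can fail with a retryable serialization error (SQLSTATE 40001), and clients are expected to retry it. A driver-agnostic sketch of that retry loop, assuming a `RetryableError` stand-in and an illustrative `transfer` transaction (neither is a real driver API):

```python
# Client-side retry loop for CockroachDB-style retryable transaction errors.
import time

class RetryableError(Exception):
    """Stand-in for a serialization failure (SQLSTATE 40001)."""

def run_transaction(txn, max_retries=5):
    for attempt in range(max_retries):
        try:
            return txn()                       # run the whole transaction body
        except RetryableError:
            time.sleep(0.001 * (2 ** attempt)) # exponential backoff, then retry
    raise RuntimeError("transaction gave up after retries")

attempts = {"n": 0}
def transfer():
    attempts["n"] += 1
    if attempts["n"] < 3:                      # simulate two conflicts, then success
        raise RetryableError()
    return "committed"

print(run_transaction(transfer))  # → committed
```

With a real driver, the `txn` callable would open a transaction, issue SQL, and commit; the key point is that the entire body is re-executed on a retryable error.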
9. CockroachDB Layered Architecture

Every CockroachDB node runs the same five-layer stack:
● SQL Layer
● Transaction Layer
● Distribution Layer
● Replication Layer
● Storage Layer

[Diagram: two identical nodes, each running all five layers on top of the Pebble KV store. A write to a key in key-range 2 is routed to the node holding the Range 2 leader, then replicated to the Range 2 follower on the other node.]
10. Salesforce Connector
• Salesforce is the leading SaaS provider with its flagship CRM platform, available exclusively online.
• It helps businesses connect to their customers in a whole new way, so they can find more prospects, close more
deals, and wow customers with amazing service.
• Anypoint Connector for Salesforce (Salesforce Connector) enables you to create apps that react to Salesforce
events such as adding, changing, or deleting objects, topics, and channels.
11. Salesforce Replay Channel Listener
• This operation subscribes to Salesforce channel/topic events.
• A further advantage of the replay topic/channel listener is that it can continue from the last replay ID it received before the application restarted.
• Connector versions earlier than 10.14 offer only the four REPLAY ID options below at the connector level:
• 1) ALL
• 2) ONLY_NEW
• 3) FROM_REPLAY_ID
• 4) FROM_LAST_REPLAY_ID
12. -1 : ONLY_NEW
• Listens only to new events generated after the subscription starts.
• Ideal for real-time streaming.
• Example Use Cases:
• A live dashboard showing only new order updates from Salesforce.
• A real-time fraud detection system processing transactions as they happen.

-2 : ALL_EVENTS
• Consumes all events generated within the 24- or 72-hour retention period.
• Ideal for loading historical data.
• Example Use Case:
• Receiving historical data for audit logs/analytics.
13. FROM_REPLAY_ID
• Resumes from a specific Replay ID that you store yourself.
• Used for guaranteed event processing in case of failures.
• Example Use Case:
• A MuleSoft application stores the last processed Replay ID in an Object Store; if the app restarts, it resumes from where it left off.
• Ensures no duplicate processing and no missed events.

FROM_LAST_REPLAY_ID
• On first deployment, the connector looks for a replay ID in the object store; finding none, it starts with replay ID -2.
• To enable the auto-replay feature, select FROM_LAST_REPLAY_ID as the Replay Option and enable Object Store v2.
• Example Use Cases:
• A Mule application crashes and restarts, and you want to process all events from the last 24 hours.
• A new subscriber needs historical data.
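The FROM_LAST_REPLAY_ID behaviour can be sketched outside Mule as a persisted-cursor loop. Everything here is an illustrative stand-in, not the connector's API: a plain dict plays the object store, `fetch_events` plays the streaming channel, and -2 means "replay all retained events".

```python
# Sketch of replay-ID resumption: the last processed replay ID is persisted
# in a key-value "object store"; on (re)start the consumer resumes from it.
EVENT_LOG = [(1, "created"), (2, "updated"), (3, "deleted")]  # (replay_id, event)

def fetch_events(after_replay_id):
    # Stand-in for the channel: return events newer than the cursor.
    if after_replay_id == -2:          # first deployment: all retained events
        return list(EVENT_LOG)
    return [(rid, e) for rid, e in EVENT_LOG if rid > after_replay_id]

def consume(store):
    last = store.get("last_replay_id", -2)  # no stored ID -> start at -2
    seen = []
    for rid, event in fetch_events(last):
        seen.append(event)
        store["last_replay_id"] = rid       # persist the cursor per event
    return seen

store = {}
print(consume(store))   # first run: replays all retained events
print(consume(store))   # restart: nothing is reprocessed, nothing is missed
```

The persisted cursor is what gives "no duplicate processing and no missed events" across restarts, which is what enabling Object Store v2 provides for the real connector.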