
AI for the Enterprise

Aletheia
September 24, 2024


Transcript

  1. The landscape of AI is vast and continually evolving, with various subfields offering specialized applications. Awareness of this landscape is essential for professionals across disciplines who want to harness AI's capabilities effectively.
    • Artificial Intelligence: the science of building machines capable of reasoning similarly to humans, or even exceeding them.
    • Machine Learning: the backbone of modern AI; machine learning algorithms allow computers to learn from data. Techniques range from supervised to unsupervised learning, with deep learning becoming increasingly popular.
    • Deep Learning: a subfield of machine learning built on a particular class of algorithms called neural networks.
    • Natural Language Processing (NLP): a subfield focused on the interaction between computers and human languages. Examples include machine translation, chatbots, and sentiment analysis.
    • Computer Vision: enables machines to interpret visual information from the world. Key applications include facial recognition, medical image analysis, and autonomous vehicles.
    01. AI LANDSCAPE The AI Landscape 4
  2. In this phase, the business problem is framed as a machine learning problem: what is observed and what should be predicted (known as the label or target variable). Determining what to predict, and which performance and error metrics need to be optimized, is a key step in ML. For example, imagine a scenario where a manufacturing company wants to identify which products will maximize profits. Reaching this business goal depends in part on determining the right number of products to produce. In this scenario, you want to predict the future sales of the product based on past and current sales. Predicting future sales becomes the problem to solve, and ML is one approach that can be used to solve it. 8 02. TRANSFORMERS Real-Life Machine Learning Workflow
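A minimal sketch of how such a forecasting task could be framed as supervised learning. The library (scikit-learn), the column layout, and the figures are illustrative assumptions, not part of the deck:

```python
# Hypothetical sketch: framing "predict next month's sales" as supervised
# regression, with recent sales as features and future sales as the label.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_absolute_error

# Toy data: each row holds the last three months of unit sales for a product;
# the target is the sales observed in the following month.
past_sales = np.array([
    [120, 130, 125],
    [ 90,  95, 100],
    [200, 210, 190],
    [150, 160, 170],
])
next_month_sales = np.array([128, 105, 205, 175])

model = LinearRegression().fit(past_sales, next_month_sales)

# Forecast for a product that sold 140, 145 and 150 units in recent months.
forecast = model.predict(np.array([[140, 145, 150]]))[0]
print(f"Forecast for next month: {forecast:.0f} units")

# The error metric chosen while framing the problem (here MAE) is what the
# workflow later optimizes and monitors.
print("In-sample MAE:", mean_absolute_error(next_month_sales, model.predict(past_sales)))
```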
  3. • Define criteria for a successful outcome of the project
    • Establish an observable and quantifiable performance metric for the project, such as accuracy, prediction latency, or minimized inventory value
    • Formulate the ML question in terms of inputs, desired outputs, and the performance metric to be optimized
    • Evaluate whether ML is a feasible and appropriate approach
    • Create a data sourcing and data annotation objective, and a strategy to achieve it
    • Start with a simple model that is easy to interpret and makes debugging more manageable
    9 02. TRANSFORMERS ML problem framing
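A minimal sketch of how the framing checklist above could be made measurable, assuming scikit-learn and an easy-to-interpret baseline model; the targets (ACCURACY_TARGET, LATENCY_BUDGET_MS) and data are hypothetical, not from the deck:

```python
# Hypothetical sketch: turning success criteria into quantifiable checks.
import time
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier   # simple, interpretable baseline
from sklearn.metrics import accuracy_score

ACCURACY_TARGET = 0.85      # success criterion agreed with stakeholders
LATENCY_BUDGET_MS = 10.0    # per-prediction latency constraint

X, y = make_classification(n_samples=1_000, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_train, y_train)

start = time.perf_counter()
predictions = model.predict(X_test)
latency_ms = (time.perf_counter() - start) / len(X_test) * 1_000  # avg per sample

accuracy = accuracy_score(y_test, predictions)
print(f"accuracy={accuracy:.2f} (target {ACCURACY_TARGET}), "
      f"latency={latency_ms:.3f} ms/prediction (budget {LATENCY_BUDGET_MS} ms)")
print("Feasible:", accuracy >= ACCURACY_TARGET and latency_ms <= LATENCY_BUDGET_MS)
```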
  4. 17 Neosperience AI Technologies 02. TRANSFORMERS
    • Expert Systems — simulate logical reasoning: “Rules” (Declaro)
    • Machine Learning — learn from experience (patterns, predictive models): “Prediction”
    • LLM — understand and produce text in defined contexts: “Language”
  5. 18 Neosperience AI Offering 02. TRANSFORMERS
    • Expert Systems — product configuration with business and technical constraints: rule creation, custom UX, system integration
    • Machine Learning — MLOps platform (on premise or in cloud): data engineering, algorithms
    • LLM — LLM integration platform: prompt engineering, system integration
  6. 19 Neosperience AI Target Users 02. TRANSFORMERS
    • Expert Systems — Sales Team (B2B), Marketing
    • Machine Learning — Production, Customer Clustering, Safety & Security
    • LLM (LLM integration platform) — domain experts, process experts, content producer experts
  7. 20 Neosperience AI Use Cases 02. TRANSFORMERS — Expert Systems, Machine Learning and LLM: • Budgeting of standard products • Complex systems configuration • CPQ • Product configuration • Custom DEM • RFQ recognition (Ringmill) • Form Reading (Systhema) • Technical Chatbot (BPS) • Image Visual Search • Avatar (Royaland) • Q&A (Morgan) • Travel assistant (Trenord) • Document Classification • Search of similar projects • Anomaly Prediction • Process Latency Prediction • Video Analysis for marketing • Video Analysis for safety
  8. “The failure rate on AI projects has been between 83% and 92%” — Fortune.com, https://fortune.com/2022/07/26/a-i-success-business-sense-aible-sengupta
  9. “Successful AI initiatives require a good understanding of AI projects lifecycle” — Forbes.com, https://www.forbes.com/sites/cognitiveworld/2022/08/14/the-one-practice-that-is-separating-the-ai-successes-from-the-failures/?sh=6df5b30e17cb
  10. Generative AI Project Lifecycle
    • PROJECT SCOPE DEFINITION — Shape the use case by defining the task the project is expected to solve. Select the interaction interface to be exposed to users. Define KPIs and constraints for the solution to be acceptable. Define the overall project running budget.
    • MODEL SELECTION — Select the optimal Foundation Model (FM) to use, based on available data, supported languages, and regulatory constraints.
    • ADAPTATION & ALIGNMENT — Adopt techniques to make the model adapt to the specific task. Evaluate fine-tuning opportunities to increase model specificity to languages and tasks. Evaluate model alignment to further customize tone of voice, enforce guardrails, and prevent hallucinations.
    • APPLICATION INTEGRATION — Integrate models with external data sources to provide up-to-date or real-time responses, overcome context constraints, and call APIs. Implement reasoning and acting to improve autonomous interactions.
    • DEPLOY — Define deployment targets and hardware constraints. Perform model optimization to balance precision against required computing power. Exploit SaaS/cloud/on-premise alternatives to address company constraints and budget.
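One common way to realize the application-integration step is retrieval-augmented prompting: fetch relevant, up-to-date company data and place it in the prompt before calling the foundation model. The sketch below is illustrative only; `call_foundation_model`, the knowledge-base contents, and the prompt wording are hypothetical placeholders for whatever FM and data sources the project selects:

```python
# Minimal retrieval-augmented sketch of the "application integration" step:
# ground the foundation model on fresh company data before answering.
from datetime import date

KNOWLEDGE_BASE = {
    "return policy": "Products can be returned within 30 days of delivery.",
    "warranty": "All machines ship with a 24-month warranty.",
}

def retrieve_context(question: str) -> str:
    """Naive keyword retrieval; a real system would query a vector store."""
    hits = [text for key, text in KNOWLEDGE_BASE.items() if key in question.lower()]
    return "\n".join(hits) or "No relevant documents found."

def call_foundation_model(prompt: str) -> str:
    # Placeholder: replace with the SDK call of the selected FM
    # (SaaS, cloud, or on-premise, as decided during model selection).
    return f"[FM answer grounded on the prompt below]\n{prompt}"

def answer(question: str) -> str:
    context = retrieve_context(question)
    prompt = (
        f"Today is {date.today()}.\n"
        "Answer using only the context below; say 'I don't know' otherwise.\n"
        f"Context:\n{context}\n\nQuestion: {question}"
    )
    return call_foundation_model(prompt)

print(answer("What is your return policy?"))
```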
  16. 31 Incremental Projects Lifecycle (IPL) — Sometimes requirements are unclear and the project scope cannot be defined before a working prototype is built. In such cases, an incremental approach is preferable because it gives the customer an understanding of the direction the solution is heading while keeping the budget in check. The Generative AI project lifecycle can be grouped into three phases, aimed at showcasing feasibility and matching business requirements. The Proof-of-Concept (PoC) is the initial phase, where requirements and project scope are properly defined; in this phase, model capabilities are also matched against customer requirements and a baseline showcasing the expected result is demonstrated. Sometimes, due to the uncertainty of the environment and the continuous evolution of the technology, the PoC phase is replaced by a Research and Development (R&D) phase, which allows for better management of uncertainty within a constrained effort. In the Minimum Viable Product (MVP) phase, model performance is tailored to production requirements and the main features of the solution are developed. The release phase accounts for all the integration features, GUIs, and deployments needed to support scalability and reliability. (Diagram: PoC/R&D → MVP → Release, spanning Project Scope Definition, Model Selection, Adaptation & Alignment, Application Integration, Deploy) 06. PROJECT LIFECYCLE
  17. 32 IPL — Phases 06. PROJECT LIFECYCLE
    • Research and Development (R&D) — Starts with project kick-off and covers the solution design process, requirements mapping, model evaluation and selection, and initial prompt engineering. Usually an alternative to the PoC phase. Outcome: R&D report, specific tests / PoCs. Target users: internal users, stakeholders.
    • Proof-of-Concept (PoC) — Starts with project kick-off and covers the solution design process, requirements mapping, model evaluation and selection, and initial prompt engineering. Outcome: solution project, critical path definition, budget estimation, working prototype in a sandbox environment or on exported data. Target users: internal users, stakeholders, project team.
    • Minimum Viable Product (MVP) — Starts when the PoC is approved. Its goal is to fine-tune models and prompts, with possible model alignment using RLHF. It iterates multiple times through evaluation and engineering sub-phases; then integrations with the systems providing data are built. Outcome: viable product implementing the requested features on customer data, critical path implementation, integrations with customer systems, production ready. Target users: stakeholders, end users.
    • Release — Aims to scale the MVP to the customer base, accounting for reliability and high availability. Outcome: released full-feature solution, customer training (optional). Target users: end users, general audience.
  18. THE FUTURE OF CRAFTSMANSHIP — A NEW WORLD Machine Learning 34 AI: AN OVERVIEW Case Study — Machine learning to streamline design processes: speed up feasibility analysis activities and the related project costing. "How can we reuse what we have already designed to respond to a new customer request?"
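A minimal sketch of how "reuse what we have already designed" could start: index past project descriptions and rank them against a new request. The library (scikit-learn TF-IDF plus cosine similarity) and the example texts are assumptions; a production system would likely use richer features or embeddings:

```python
# Hypothetical sketch of "search for similar past designs": index previous
# project descriptions and rank them against a new customer request.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

past_projects = [
    "Stainless steel gearbox housing for packaging line, 500 units",
    "Aluminium bracket with custom drilling pattern for conveyor",
    "Cast iron pump body, high-pressure variant",
]
new_request = "Customer asks for a steel housing for a packaging machine gearbox"

vectorizer = TfidfVectorizer()
matrix = vectorizer.fit_transform(past_projects + [new_request])

# Similarity of the new request (last row) against every archived design.
scores = cosine_similarity(matrix[-1], matrix[:-1]).ravel()
for score, project in sorted(zip(scores, past_projects), reverse=True):
    print(f"{score:.2f}  {project}")
```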
  19. THE FUTURE OF CRAFTSMANSHIP — A NEW WORLD Machine Learning 35 AI: AN OVERVIEW Case Study — Detect existing data in the system, analyze it, detect anomalies, and create correlations between machine failures and events tracked in the company. • How is my machine doing? • Does it consume too much? • How often do I have failures on the machines, and on which components? • What actions on the machine are related to a likely failure? • Are there correlations between requests for work on the machines, oil consumption, and the work done on the machine?
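A minimal sketch of the anomaly-detection part of this case study, assuming scikit-learn's IsolationForest and invented sensor features (oil consumption, vibration, hours since service); the real signals and thresholds would come from the customer's plant data:

```python
# Hypothetical sketch: flag anomalous machine readings that may precede failures.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
# Columns: oil consumption (l/day), vibration (mm/s), hours since last service.
normal_readings = rng.normal(loc=[5.0, 2.0, 100.0], scale=[0.5, 0.3, 20.0], size=(200, 3))
suspect_readings = np.array([[9.5, 6.0, 400.0],   # heavy oil use, strong vibration
                             [5.1, 2.1, 110.0]])  # looks ordinary

detector = IsolationForest(contamination=0.05, random_state=0).fit(normal_readings)
labels = detector.predict(suspect_readings)   # -1 = anomaly, +1 = normal
for reading, label in zip(suspect_readings, labels):
    status = "ANOMALY" if label == -1 else "normal"
    print(status, reading)
```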
  20. THE FUTURE OF CRAFTSMANSHIP — A NEW WORLD Computer Vision — enables machines to interpret visual information from the world. Key applications include facial recognition, medical image analysis, and self-driving vehicles. Computer Vision 36 Examples — Quality control: using computer vision to inspect products during the manufacturing process, identifying defects, imperfections, or deviations from quality standards and ensuring that only compliant products are shipped to customers. Object recognition: using computer vision algorithms to identify and classify objects within images or videos, such as recognizing cars, people, or animals in a video feed. Object tracking: using computer vision to monitor and follow the movement of objects in a video, such as tracking moving vehicles or people in a surveillance area. Security: computer vision systems for facial recognition and access monitoring of sensitive premises, detecting intrusions and suspicious behavior in real time and improving the security of buildings and restricted areas. AI: AN OVERVIEW
  21. PeopleAnalytics implements state-of-the-art (SOTA) algorithms. • Object recognition: identifies objects within a video, drawing a bounding box around each and classifying it according to a standard list. • Object/People tracking: tracks the movements of objects or people within a video, drawing a trajectory. • Behavior detection: identifies people interacting with objects within a video. Our platform offers these ready-to-use state-of-the-art models, which represent the core capabilities of any video detection application. (Example detections: PERSON 0.98, PERSON 0.92, PERSON 0.78, PERSON 0.64) State-of-the-Art in Applied AI
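PeopleAnalytics itself is proprietary, but the object-recognition building block described above can be sketched with an off-the-shelf detector. The snippet below uses torchvision's pretrained Faster R-CNN as a stand-in, with a random tensor in place of a decoded video frame:

```python
# Hypothetical sketch of the object-recognition building block: a pretrained
# detector returns bounding boxes, class labels and confidence scores per frame.
import torch
from torchvision.models.detection import fasterrcnn_resnet50_fpn

model = fasterrcnn_resnet50_fpn(weights="DEFAULT").eval()

# Placeholder frame: a real pipeline would feed decoded video frames here.
frame = torch.rand(3, 480, 640)

with torch.no_grad():
    detections = model([frame])[0]

for box, label, score in zip(detections["boxes"], detections["labels"], detections["scores"]):
    if score > 0.6:   # keep confident detections only, e.g. "PERSON 0.98"
        print(f"class={label.item()} score={score:.2f} box={box.tolist()}")
```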
  22. Standard safety use cases — use-case-specific model training. Extensive training with dedicated datasets builds models suitable for a variety of real-life use cases: • Vehicle tracking • People tracking • PPE detection • Object-holding detection • Man-on-the-ground detection. Our platform offers these ready-to-use proprietary models, the result of many years of working with top customers in a variety of industrial applications.
  23. Model adaptation and alignment to plant-specific use cases — Our platform comprises specialized models for plant safety, trained on specific safety use cases, to detect: • Safety equipment • Dangerous behaviors • Plant-specific equipment. These proprietary models are available within the PeopleAnalytics product. They are trained on standard equipment but can be fine-tuned to further improve detection accuracy. (Example detections: HELMET 0.98, MASK 0.97, AIR TANK 0.92, RADIO 0.96, METER 0.88, GLOVES 0.78, GLOVES 0.78)
  24. Face Scrambling Model — Privacy Preservation. For privacy-sensitive use cases, we provide our proprietary face scrambling model, a deep learning model running on the edge camera, which anonymizes frames in compliance with GDPR. The face scrambling model requires either cameras that support deep learning model deployment or an external edge coprocessor. This ensures that no image containing PII is ever stored on the camera or in any remote company storage, thus preserving privacy by design.
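A minimal sketch of the same idea as the face scrambling model, using OpenCV's bundled Haar cascade as a stand-in for the proprietary edge model: detect faces and blur them before a frame leaves the device:

```python
# Hypothetical sketch of frame anonymization: detect faces and blur the
# corresponding pixels so no PII-bearing image leaves the edge device.
import cv2

cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def anonymize(frame):
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5):
        frame[y:y + h, x:x + w] = cv2.GaussianBlur(frame[y:y + h, x:x + w], (51, 51), 0)
    return frame

# Usage: anonymize each captured frame before storing or streaming it.
# capture = cv2.VideoCapture(0)
# ok, frame = capture.read()
# if ok:
#     cv2.imwrite("anonymized.jpg", anonymize(frame))
```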
  25. PeopleAnalytics Dashboard • PPE configuration (enable/disable) • Textual alarms fired on every detection • Visual detections on video streams coming from multiple cameras • Worker digital twins to detect health parameters and proactively ensure safety
  26. Applied AI — Selected Success Stories • PeopleAnalytics has been deployed in several selected use cases (mostly under NDA). • Adoption ranges from security to safety across a wide set of applications. • Models can be fine-tuned to target customer-specific equipment, plant configurations, or even lighting conditions, thus improving accuracy. • Common use cases are addressed through our open platform approach, which allows easy integration with existing video and plant management systems.
  27. Amazon PartyRock https://partyrock.aws/ — VinAbbinamento: https://partyrock.aws/u/aletheia/ie48JkCDO/VinAbbinamenti — Product Inspired Messaging Generator: https://partyrock.aws/u/aletheia/YOpdxGRtF/Product-Inspired-Messaging-Generator
  28. Thank You. 25125 BRESCIA, VIA ORZINUOVI, 20 — 20137 MILANO, VIA PRIVATA DECEMVIRI, 20 — WWW.NEOSPERIENCE.COM