Introduction
The SAP Business Data Cloud (BDC) on the SAP Business Technology Platform (BTP) is currently one of the most discussed topics in the SAP ecosystem. Many companies expect SAP Databricks in particular to provide a user-friendly, scalable platform for data analytics and AI development. Companies whose SAP systems already run in the cloud benefit especially from the new zero-copy approach: a paradigm shift that allows direct access to source data without laboriously replicating it into a development platform.
But what can companies do today if the Business Data Cloud has not yet been introduced? What options exist for piloting initial machine learning and AI use cases in the existing SAP system landscape?
In this article, we highlight two practical scenarios for developing and operationalizing ML solutions based on SAP HANA and SAP AI Core.
SAP HANA Machine Learning
SAP HANA is the in-memory database used in modern SAP systems. It offers its own machine learning functionality via two libraries: the Predictive Analysis Library (PAL) and the Automated Predictive Library (APL). In both cases, the calculations are carried out directly on the database, without the need to export the data.
The Predictive Analysis Library (PAL) is a collection of models and algorithms for machine learning and statistical analysis integrated directly into the SAP HANA database. PAL offers over 100 algorithms in the areas of classification, regression, clustering, time series analysis, recommender systems, and more.
Similar to PAL, the Automated Predictive Library (APL) also includes models from the areas of classification, regression, clustering and time series. However, APL focuses on automated machine learning. APL automates many machine learning steps, such as data preparation, feature engineering, model selection, optimization, and model evaluation.
Both PAL and APL offer several access options. The libraries can be controlled and machine learning training triggered via SQLScript, for example from SAP HANA Studio. For machine learning developers, however, the access options via R (hana.ml.r) or Python (hana-ml) are likely more convenient. These client-side libraries invoke the corresponding PAL or APL functions on HANA. Machine learning developers can thus work in their usual programming language while still using the in-database machine learning functions of PAL and APL.
Figure 1 shows a possible architecture of the ML lifecycle from a developer's perspective. Model training is initiated via the appropriate libraries, and trained models can be saved directly in HANA. With the help of SQL procedures, predictions for new data can be generated and then retrieved from business applications.

SAP AI Core
SAP AI Core is a service within the SAP Business Technology Platform (BTP). It enables the systematic execution of machine learning workflows (training and deployment).
Both SAP systems and cloud providers can serve as data sources. In its architecture, SAP relies heavily on third-party technologies that are widely used in industry: the core is a Kubernetes cluster that scales for both training and deployment. Workflows are defined as Argo Workflows templates and executed on the Kubernetes cluster. AI Core offers connections to Git repositories (such as GitHub) for code management and to Docker registries for containerization. Within a Docker container, you can work in your preferred programming language, so you are free to choose your own machine learning models rather than relying on ready-made ones. The KServe framework is used to expose trained models as inference services with REST interfaces.
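To make the serving side concrete, here is a minimal sketch of the kind of REST inference service one might package into a Docker image for deployment on AI Core. It uses only the Python standard library; the model is a trivial stand-in, and the route is a hypothetical choice, not a prescribed AI Core path.

```python
# Minimal REST inference service sketch (stdlib only).
# The model and the /v1/predict route are hypothetical stand-ins for a
# real KServe-style predictor packaged into a Docker image.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer


def predict(features):
    # Stand-in for a real model: a trivial additive score.
    return sum(features)


class InferenceHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        if self.path != "/v1/predict":
            self.send_error(404)
            return
        length = int(self.headers.get("Content-Length", 0))
        payload = json.loads(self.rfile.read(length))
        body = json.dumps({"prediction": predict(payload["features"])}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        # Keep the service quiet; container logs are collected elsewhere.
        pass


# As a container entrypoint one would run, for example:
#   HTTPServer(("0.0.0.0", 8080), InferenceHandler).serve_forever()
```

In a real deployment the handler would load a serialized model at startup and AI Core/KServe would route external requests to the container's port.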
If AI Core is used with HANA as a data source, the HANA machine learning libraries can also be used. The data then does not have to be transferred and the machine learning workflows are executed directly on HANA.
With the Generative AI Hub, AI Core also offers a solution for deploying generative AI. It provides access to various commercial and open-source large language models (LLMs). SAP AI Core can therefore be used across the entire spectrum of machine learning and AI use cases.
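On the consumption side, a deployed LLM is typically called through a REST endpoint. The sketch below builds an OpenAI-style chat payload and posts it to a deployment URL; the URL, token handling, and exact request schema are hypothetical placeholders, so consult the Generative AI Hub documentation for the concrete API.

```python
# Sketch of calling a deployed LLM over REST.
# The deployment URL, token, and request schema are hypothetical placeholders.
import json
import urllib.request


def build_chat_payload(user_message):
    # OpenAI-style chat payload, a common format for hosted LLM endpoints.
    return {
        "messages": [
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": user_message},
        ],
        "max_tokens": 200,
    }


def call_llm(deployment_url, token, user_message):
    # Real authentication and paths come from your AI Core deployment details.
    req = urllib.request.Request(
        deployment_url,
        data=json.dumps(build_chat_payload(user_message)).encode(),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```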
Figure 2 schematically shows the use of AI Core for training and deploying a machine learning-based solution. Business applications can receive and reuse results via the REST interface.

Conclusion
With HANA Machine Learning and AI Core, SAP offers two different but powerful approaches for developing and operating machine learning and AI within SAP today. SAP HANA Machine Learning works directly on the HANA in-memory database and thus makes data transfers superfluous; it is particularly suitable for rapid prototyping or for developing and deploying individual use cases. With AI Core, SAP provides an SAP-centric platform for developing and operating machine learning and generative AI. On the basis of a scalable Kubernetes cluster, more flexible and complex developments and deployments can be carried out.
What's next: SAP Databricks
SAP Databricks was announced in February 2025 and has been available on SAP BTP as part of Business Data Cloud since May. The solution promises to provide a powerful platform for data and AI development directly within the SAP landscape.
A central feature is the so-called zero-copy technology, which makes it possible to carry out analyses and developments directly on source data. Comprehensive data replication mechanisms, which make data projects significantly more complex and extensive, are therefore no longer required. However, this requires that the SAP systems are already operated in the cloud.
In addition, compared to AI Core, SAP Databricks integrates further technologies established in data science: a data lakehouse architecture, scalable computing with Apache Spark, interactive notebooks for analytics, and MLflow for managing and tracking machine learning experiments.
In the next part of this blog series, we'll take a closer look at SAP Databricks: its architectures, application scenarios, and best practices.