Generative AI is transforming how modern applications are built, deployed, and scaled. Developers can now run tasks such as text summarization inside the database with a simple SQL command.
Azure Database for PostgreSQL integrates directly with Azure AI services, giving developers GenAI capabilities right in their data layer: streamlined workflows, automated insights, and next-level user experiences.
Azure AI Extension for PostgreSQL
The Azure AI extension for PostgreSQL is a powerful add-on that enables developers to bring AI capabilities directly into their PostgreSQL databases hosted on Azure.
The extension makes it easy to incorporate advanced Azure AI services, including Azure OpenAI, Cognitive Services, and Azure Machine Learning, into your data workflows without transferring information out of your database environment.
Core Functionality
The core of the extension lies in dedicated schemas, each of which maps to an Azure AI service. These schemas modularize the AI capabilities, letting developers focus on the specific service they need.
Benefits
By extending PostgreSQL with Azure AI, you can:
* Keep data secure and in place: Perform AI operations without moving data outside the database.
* Simplify development: Use familiar SQL to call AI models and services, reducing the need for custom application logic.
* Scale intelligently: Leverage Azure's managed AI infrastructure alongside a fully managed database.
* Accelerate innovation: Quickly build intelligent features -- like chatbots, content generation, or smart recommendations -- using integrated AI functions.
Configuration and Enablement (Simplified)
To get started, follow these basic steps:
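A minimal sketch of the setup, assuming the extension is allow-listed on your server and you have an Azure OpenAI resource; the endpoint URL and key below are placeholders:

```sql
-- 1. Create the extension (azure_ai must first be allow-listed
--    in the server's azure.extensions parameter).
CREATE EXTENSION IF NOT EXISTS azure_ai;

-- 2. Point the extension at your Azure OpenAI resource.
SELECT azure_ai.set_setting('azure_openai.endpoint',
                            'https://<your-resource>.openai.azure.com/');
SELECT azure_ai.set_setting('azure_openai.subscription_key', '<your-api-key>');

-- 3. Verify the configuration.
SELECT azure_ai.get_setting('azure_openai.endpoint');
```

The same `azure_ai.set_setting` pattern applies to the `azure_cognitive` and `azure_ml` endpoints and keys.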
Azure OpenAI Integration
The azure_openai schema of the Azure AI extension lets you generate and use embedding vectors directly in PostgreSQL: dense numerical representations of text that capture its semantic meaning.
Using these embeddings, your applications can go beyond a basic keyword search and engage in semantic search, similarity search, and retrieval-augmented generation (RAG).
What Are Embeddings?
Embeddings are high-dimensional vectors that represent the meaning of text. They expose the real meaning behind the words by capturing the relationships and context of words and phrases.
Similar texts produce vectors that lie close together in the vector space.
This lets your system recognize that words such as dog and puppy are related, even though they differ on the surface.
That is what makes semantic search, recommendations, and content grouping smarter and more precise.
Minimal SQL Example: Creating Embeddings
To generate embeddings using a deployed Azure OpenAI model:
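A minimal call might look like this (the deployment name is a placeholder for your own):

```sql
-- 'text-model' is the deployment name of your embedding model.
SELECT azure_openai.create_embeddings('text-model', 'sample text');
```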
* 'text-model' refers to the name of your deployed embedding model (e.g., text-embedding-ada-002).
* 'sample text' is the input string to be vectorized.
The result is a vector of floating-point numbers representing the semantic content.
Using Embeddings for Semantic Search
Once stored, embeddings let you compare text based on meaning, not just keywords. This enables use cases like:
* Semantic search: Find content similar in meaning to a user's query.
* Similarity comparison: Group related documents, articles, or messages.
* Recommendations: Suggest relevant content to users based on context.
To store and query embeddings efficiently, it's best to combine this integration with pgvector, a PostgreSQL extension for handling vector data types. With pgvector, you can store embeddings in vector columns and run fast similarity searches using SQL.
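As a sketch of how the pieces fit together, assuming an embedding model deployed as 'text-model' that returns 1536-dimensional vectors (the dimensionality of text-embedding-ada-002):

```sql
CREATE EXTENSION IF NOT EXISTS vector;

CREATE TABLE documents (
    id        bigserial PRIMARY KEY,
    content   text NOT NULL,
    embedding vector(1536)  -- must match the model's output dimensionality
);

-- Populate the embedding column from the document text.
UPDATE documents
SET embedding = azure_openai.create_embeddings('text-model', content)::vector;
```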
Together, the azure_openai schema and pgvector turn your PostgreSQL database into a powerful semantic search engine, bringing Generative AI closer to your data.
Azure Cognitive Services Integration
The Azure Cognitive Services schema in the Azure AI extension brings powerful prebuilt AI models directly into your PostgreSQL database. With this integration, you can run tasks like sentiment analysis, language detection, text summarization, and entity recognition right alongside your data.
Azure Cognitive Services provides ready-to-use models for common natural language tasks:
* Sentiment analysis: Understand whether text expresses positive, negative, or neutral sentiment.
* Language detection: Automatically detect the language of any text input.
* Text summarization: Generate concise summaries from longer pieces of text.
* Entity recognition: Extract important entities such as names, dates, and locations from text.
How It Works in SQL
You can call Cognitive Services functions directly from SQL queries. For example, to analyze the sentiment of a short text:
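Assuming the azure_cognitive endpoint and key have already been configured with azure_ai.set_setting, the call might look like:

```sql
SELECT azure_cognitive.analyze_sentiment('Great product', 'en');
```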
* 'Great product' is the input text.
* 'en' specifies the language (English).
* The result includes sentiment classification and confidence scores.
Other functions follow a similar pattern:
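For illustration, a few of the other azure_cognitive functions; exact signatures may vary by extension version:

```sql
-- Detect the language of a text input.
SELECT azure_cognitive.detect_language('Bonjour tout le monde');

-- Extract named entities such as names, dates, and locations.
SELECT azure_cognitive.recognize_entities('Satya Nadella spoke in Seattle', 'en');

-- Generate a concise summary of a longer passage.
SELECT azure_cognitive.summarize_abstractive('<long text here>', 'en');
```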
Storing cognitive analysis results in your PostgreSQL database gives you more control and simplifies your AI processes. Rather than copying data into other services and maintaining complicated pipelines, you can enrich text without it ever leaving its home. This strategy has a few benefits:
* Better data security: Sensitive information stays within your database, reducing the risk of exposure during data transfers.
* Reduced latency: Running cognitive tasks close to the data means lower response times, which is essential in real-time scenarios.
* Simpler architecture: There is no need to build and maintain an additional integration tier or external storage systems.
* Easier AI development: Smart functionality such as feedback analysis, content moderation, smart tagging, and text classification is simpler to integrate into your applications.
* Improved scalability: Processing high text volumes is straightforward when cognitive functions run local to your data.
Azure Machine Learning Integration
The Azure ML schema within the Azure AI extension enables PostgreSQL to invoke custom machine learning models hosted on Azure Machine Learning. This integration allows developers to perform real-time inference directly from SQL, streamlining workflows and reducing architectural complexity.
Here's a minimal example of how to call a deployed model:
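A sketch, assuming a scoring endpoint configured via azure_ai.set_setting; the feature names in the payload are hypothetical and would be whatever your model expects:

```sql
-- Configure the Azure ML scoring endpoint once.
SELECT azure_ai.set_setting('azure_ml.scoring_endpoint',
                            'https://<endpoint>.<region>.inference.ml.azure.com/score');
SELECT azure_ai.set_setting('azure_ml.endpoint_key', '<endpoint-key>');

-- Invoke the deployed model with a JSONB payload (hypothetical features).
SELECT azure_ml.inference(
    '{"input_data": {"temperature": 21.5, "humidity": 0.61}}'::jsonb
);
```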
The input_data object contains the features expected by your model. The result is a JSONB object with the model's prediction or output.
Flexible JSON-Based Input/Output
Because input and output are both JSONB, this approach supports a wide range of ML scenarios. A few configuration and security notes:
* Only users with the azure_ai_settings_manager role can configure endpoints and keys.
* Use Azure Key Vault for secure credential management and rotation.
* Ensure your model endpoint is deployed and accessible via REST API.
Practical Use Cases
Let's look at some of the use cases:
Sentiment Analysis and Opinion Mining
Sentiment analysis gives you a quick read on whether customer feedback is negative, positive, or neutral. Running it directly within your database means you can analyze large volumes of reviews, comments, or survey answers without the need for additional tools.
Use it to:
* Customer insight: Identify satisfaction drivers and pain points
* Brand monitoring: Track public perception across channels
* Decision support: Inform product development and marketing strategies
Database-Level Analysis
By performing sentiment analysis directly within Azure Database for PostgreSQL, businesses benefit from:
* Scalability: Analyze large volumes of feedback without external pipelines
* Efficiency: Reduce latency by keeping data and analytics co-located
* Security: Maintain governance and access control within the database
Applications
* Feedback monitoring: Analyze NPS surveys, app reviews, and support logs
* Content personalization: Tailor messaging based on user sentiment
* Customer service triage: Prioritize tickets based on emotional urgency
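As a sketch of the feedback-monitoring case, assuming a reviews table with a body column (both names hypothetical), and assuming the sentiment result type exposes a sentiment field:

```sql
-- Add a column to hold the sentiment label for downstream triage.
ALTER TABLE reviews ADD COLUMN IF NOT EXISTS sentiment text;

-- Classify each review in place; the composite result's
-- 'sentiment' field holds the positive/negative/neutral label.
UPDATE reviews
SET sentiment = (azure_cognitive.analyze_sentiment(body, 'en')).sentiment;
```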
Semantic Similarity Search
Traditional keyword search matches exact words, but it may miss the actual meaning of a query. Semantic similarity search overcomes this by capturing the context and meaning of text in a multi-dimensional space through embedding vectors.
Your embeddings can then be compared with efficient vector operations, such as cosine similarity or inner product, using the pgvector extension in PostgreSQL. This lets you rank results by closeness of meaning to the query rather than by the exact words used.
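A sketch of a ranked semantic search, assuming a documents table with a vector embedding column and a model deployed as 'text-model'; `<=>` is pgvector's cosine-distance operator:

```sql
-- Embed the query text and return the five closest documents by meaning.
SELECT id, content
FROM documents
ORDER BY embedding <=>
         azure_openai.create_embeddings('text-model', 'postgres ai tutorial')::vector
LIMIT 5;
```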
Practical examples include:
* Smarter product search in e-commerce, locating products with similar descriptions or reviews.
* Recommendation engines that suggest related articles, documents, or products by meaning.
* Knowledge bases or FAQs that return the most relevant answers, even when questions are vague or phrased differently.
Final Thoughts
Integrating generative AI into Azure Database for PostgreSQL is a significant step toward designing and building intelligent applications. By bringing AI to an open-source SQL platform like PostgreSQL, the database gives developers the ability to implement AI features side-by-side with their data.
Through extensions for Azure OpenAI, Azure Cognitive Services, and Azure Machine Learning, developers can build workflows that combine semantic understanding, real-time inference, and scale-out analytics.
This approach helps streamline application development by:
* Reducing architectural complexity
* Improving data locality and overall performance
* Providing secure, SQL-native access to advanced AI services
When you implement these capabilities, consider whether GenAI is suitable for your use case. AI in the data layer is not always necessary; performance depends on data volume, latency requirements, and model complexity.
Avoid weighing technical feasibility and business value in isolation; consider them together so that your solution delivers the desired impact at scale.