I will now continue my admittedly, perhaps infuriatingly, fact-free approach to human history, familiar from the previous blog article.

We can’t talk about Emperor Genghis Khan without also mentioning his son Ögedei Khan, who continued to rule and expand his father’s empire with an iron fist.

For a father who only cared about how far a man could ride a pony in a day, a nerdy boy who wasn’t interested in horses must have been a big disappointment. But Ögedei Khan turned out to be a brilliant administrator who, unlike his saber-rattling, pony-riding father, focused on infrastructure and long-term development.

He understood that the pen could be mightier than the sword and that a thousand words could defeat a thousand soldiers. Unfortunately for Ögedei Khan, he could not read or write a single word himself.

So he had to hire the best scribes in the entire empire to write the book ‘The Secret History of the Mongols’, whose initial motive might have been simply to make his dad’s questionable pony adventures look good. Fair enough, you could call it ‘corporate marketing’ these days.

Why Azure AI Foundry?

Times have changed since Ögedei Khan. Nowadays, if you can’t really write and you aren’t an emperor either, there’s no need to worry! You can still write your own success story. You can configure your own AI models in Azure AI Foundry and generate any fairy tale with just a command or two.

Azure AI Foundry is a service that allows you to host your AI models centrally. When you have all your models stored within one service, it makes it easier to integrate a wide range of AI services into your systems. Instead of having to subscribe to a bunch of different services with different and changing API interfaces, Azure AI Foundry provides a unified interface format for all of them.

With Azure AI Foundry, you can host an array of different kinds of AI models, such as text generators, image and video generators, and more. But…

Do I really need all these things?

Sure you do! You see, writing stories has become quite easy with the help of AI. Perhaps even too easy. This means that just writing something does not impress anybody anymore. Your readers will always be wondering whether you just generated your story with ChatGPT and copy-pasted it to your site or document.

To really produce content that brings real value, more is needed.

You have to integrate different kinds of AI models and combine them in a unique way to make a difference. Sounds hard? Yes, it is hard. It is hard in the way that writing anything at all was hard for the illiterate Ögedei Khan back in the dark days of the 13th century. So AI makes the old stuff easier, but it also raises the bar: the old stuff isn’t enough anymore.

Paradoxically enough, AI can greatly help you learn to use AI, so you can create better output with AI.

Getting started with Azure AI Foundry

Perhaps the easiest way to get started is to log in to the Azure AI Foundry portal and deploy a preferred LLM from the model catalog, for example DeepSeek-V3-0324.

When you have configured your first LLM in the portal, write a small client that sends a query to your model and verify that it works as expected.
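As a quick smoke test, a plain HTTP call from the command line is enough. The sketch below assumes the Azure AI model inference endpoint shape; the resource name, api-version, and key variable are placeholders, so copy the real endpoint and key from your deployment’s page in the Azure AI Foundry portal.

    # Hypothetical resource name and api-version: replace with the values
    # shown for your own deployment in the Azure AI Foundry portal.
    curl -X POST \
      "https://<your-resource>.services.ai.azure.com/models/chat/completions?api-version=2024-05-01-preview" \
      -H "Content-Type: application/json" \
      -H "api-key: $AZURE_AI_API_KEY" \
      -d '{
            "model": "DeepSeek-V3-0324",
            "messages": [
              { "role": "user", "content": "Summarize the pony logistics of the Mongol Empire in one sentence." }
            ]
          }'

If the deployment works, the response is a JSON document with the generated text under choices[0].message.content.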

Managing Azure AI Foundry with Bicep

Once you have created your first Azure AI Foundry project in the portal, you may want to learn how to do it with a Bicep script. Taking the scripting approach gives you consistency and reproducibility in your future deployments and gears you up for an ever-accelerating path of process automation. As described in Microsoft’s own tutorial on creating Azure AI Foundry projects with Bicep, a great starting point is to download the robust examples it provides and adapt them to your needs.

So let’s get nerdy like Ögedei and use the command line to create a new resource group for your Azure AI Foundry project. I decided to name mine ‘rg-metamatic-models’ and place it in Sweden. You can name and place yours according to your preferences. Here’s my version:
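A minimal sketch with the Azure CLI; the resource group name comes from the text above, and ‘swedencentral’ is the assumed region name for Sweden.

    # Create the resource group in the Sweden Central region
    az group create \
      --name rg-metamatic-models \
      --location swedencentral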

Then create a Bicep file for your deployment. In that file, first define the AI Foundry foundation, that is, the account resource itself:
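Here is a minimal sketch of that account, assuming the current Foundry resource model where the account is a Microsoft.CognitiveServices account of kind ‘AIServices’; the parameter names, API version, and SKU are my assumptions, so check them against the Bicep reference and the tutorial before deploying.

    // Assumed parameter values; adjust to your own naming and region.
    param location string = 'swedencentral'
    param foundryName string = 'aif-metamatic'

    // The AI Foundry account itself, the foundation everything else builds on.
    resource aiFoundry 'Microsoft.CognitiveServices/accounts@2025-04-01-preview' = {
      name: foundryName
      location: location
      kind: 'AIServices'
      sku: {
        name: 'S0'
      }
      identity: {
        type: 'SystemAssigned'
      }
      properties: {
        // Allows Foundry projects to be created under this account
        allowProjectManagement: true
        customSubDomainName: foundryName
      }
    }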

Place your AI project inside the AI Foundry service object that you just created:
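A sketch of the project as a child resource of the account defined above; the project name, display name, and API version are placeholders of my own.

    // The project lives inside the AI Foundry account defined above.
    resource aiProject 'Microsoft.CognitiveServices/accounts/projects@2025-04-01-preview' = {
      parent: aiFoundry
      name: 'proj-metamatic'
      location: location
      identity: {
        type: 'SystemAssigned'
      }
      properties: {
        displayName: 'Metamatic models'
        description: 'Project for the Metamatic AI workloads'
      }
    }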

Finally, let’s use OpenAI’s GPT-4o model as the base workhorse for our future conquests. It’s very good. For the time being, it could be the closest AI equivalent of the mori, the all-in-one work “horse” (a pony, really) of the medieval Mongolian steppe.
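A sketch of the model deployment in the same Bicep file; the model version, SKU, and capacity are assumptions, so pick values that are actually available in your region.

    // Deploy GPT-4o under the AI Foundry account.
    resource gpt4oDeployment 'Microsoft.CognitiveServices/accounts/deployments@2025-04-01-preview' = {
      parent: aiFoundry
      name: 'gpt-4o'
      sku: {
        name: 'GlobalStandard'
        capacity: 10
      }
      properties: {
        model: {
          format: 'OpenAI'
          name: 'gpt-4o'
          version: '2024-08-06'
        }
      }
      // Serialize with the project creation to avoid concurrent updates to the account.
      dependsOn: [
        aiProject
      ]
    }

You can then deploy the whole file into the resource group created earlier, for example with az deployment group create --resource-group rg-metamatic-models --template-file main.bicep (the file name is, again, just my assumption).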

After configuring your AI models in Azure AI Foundry, you can proceed to build applications that interact with them as part of their processes. You can implement background worker processes that store data in databases hosted in Azure Cosmos DB and use AI models to analyze the data and convert it to whatever format serves your business needs.

Manage your cluster like Ögedei Khan!