Introduction:
In the summer of 2019 I visited Microsoft Ignite The Tour in Amsterdam.
It was the first Microsoft event I attended. Hosted in the RAI Amsterdam, it offered many technical “breakout” sessions to follow, but it still gave only a small overview of all the Microsoft products and services. In November I had the opportunity to visit Microsoft Ignite in Orlando, Florida, Microsoft's biggest event of the year. On November 1st I flew to Chicago and got a connecting flight to Orlando. Upon arrival in Orlando, Microsoft had several booths ready at the airport where attendees could collect their badges. After I collected my entry pass I went straight to the rental car desk. I had never visited the US before, but from watching the PGA Tour I noticed that many golf professionals seem to have the need to drive a Camaro. So I figured: it's my first time ever, let's make it a special one. I collected my “Ferrari red” Camaro and headed towards the hotel.
Having already collected my entry pass, I still had two days to get ready and settle in. As the weather was good, I played golf in the early morning at Falcon's golf club in Orlando. In the afternoon I drove to Cape Canaveral to visit NASA's Kennedy Space Center. It is a fantastic-looking place, full of technology, and I actually had the opportunity to talk to some specialists on site. But back to the topic of this article; please read my summary of the days below:

Day 1:
The day had finally come: November 4th 2019, the first day of Microsoft Ignite, one of the largest conferences Microsoft hosts every year. In May earlier that year developers came together in Seattle, in July Microsoft invited partners to Las Vegas for Inspire, and now IT professionals came to Orlando, Florida to get ignited and learn about current and future Microsoft products and services. I set my alarm for 06:30 to catch the shuttle at 07:00. I wasn't the only one thinking of taking the early shuttle, and shortly after I boarded, the bus was fully packed. I really wanted to join the Vision Keynote, and I had heard that you had to come early if you wanted to be among the first to get in. The conference traditionally opens with the Vision Keynote delivered by Satya Nadella, Microsoft's CEO. Only around three thousand people got to go into the room to experience him speak in person, and I was one of them! After waiting in a line that stretched for around a kilometre, I entered THE HUB, the central space of the conference. There I found the main stage and several supporting stages where specific products and services announced by Satya were discussed in detail by specialists.
The theme of this year's keynote was Tech Intensity: the intensity I experience every day in our work, and what makes it both exciting and challenging. Tech intensity, as Microsoft sees it, is a formula that combines adoption, capability and trust. Creating tech capability is engrained deeply in Microsoft's mission and was the main focus of Satya's speech. The topics during the keynote were: Microsoft 365, Dynamics 365, Power Platform, Developer Tools, Trust and Azure.
Azure: Satya started with the very bottom layer, which is Azure. This part was also, by far, the one featuring the most updates and announcements.
First up was the edge, which, next to the cloud, is crucial to creating capable applications. Microsoft's goal is to create a uniform and consistent experience that stretches across the cloud and the edge for both developers and IT professionals. To achieve that, they are growing the Azure Stack family of products by adding new appliances such as the Azure Stack Hub and a few smaller form factors, including ruggedised ones that fit in a backpack.
Satya then followed with, in my view, one of the more exciting announcements of the day: Azure Arc. Arc is a new service which brings the control and governance capabilities of Azure to on-premises data centres and other cloud providers. The focus, for now, seems to be on Kubernetes and data, but I will be sure to write a more in-depth piece on this service once I learn more.
Another announcement which caught my attention is Azure Synapse Analytics. Microsoft describes Synapse Analytics as the first service to bring big data, data warehousing and analytics together in a single product. Marketing aside, it is a new version of the Azure SQL Data Warehouse service, but with the addition of:

  • Data pipeline orchestration
  • Support for Spark clusters
  • Dedicated, rich tooling
  • Embedded security
  • Out-of-the-box integrations with Machine Learning
  • Simplified PowerBI publishing
It looks fascinating, and I will be sure to explore this service in depth during the remainder of the week.
Satya also mentioned Autonomous Systems, Project Silica and Azure Quantum. Those, however, were more high-level teasers than announcements of new services which we might use in our daily work.

Higher layers:
The second layer of tech capability is Trust, which is a huge focus for Microsoft. Trust does not only mean security; it also covers privacy and trusted AI. For Developer Tools, Microsoft showcased the new Visual Studio Online, which brings the familiar interface of VS Code to a web browser. It seems like a fantastic service for those who are on the go a lot, but I spoke to the PMs after the keynote, and it requires a dedicated virtual machine per user. That could potentially make it a rather expensive service, but I'll be sure to report back once I learn more. The Power Platform, which Microsoft calls a tool for citizen developers, received two exciting updates.

Power Virtual Agents allows users to create chatbots without having to code, while Power Automate allows for calls to backend APIs or even legacy UI applications. The demo in which Automate filled out a form in a Win32 application based on input from the virtual agent was very impressive. These tools really do put the power of the latest innovations into the hands of citizen developers.

For Dynamics 365, Satya mentioned that the majority of data stored in these systems is never analysed, and explained how Microsoft aims to change that with built-in data and AI capabilities. I cannot say that I found this part of the keynote very igniting.
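As a side note on how such a backend integration with Power Automate could look in practice: flows can be started through the existing “When an HTTP request is received” trigger, which generates a URL you can POST to. The sketch below is purely illustrative and is not the keynote demo; the flow URL and the payload fields are hypothetical placeholders.

    import requests

    # Hypothetical URL generated by a Power Automate flow that uses the
    # "When an HTTP request is received" trigger.
    FLOW_URL = (
        "https://prod-00.westeurope.logic.azure.com/workflows/<flow-id>"
        "/triggers/manual/paths/invoke?api-version=2016-06-01"
    )

    # Example payload a chatbot or backend service could hand to the flow.
    payload = {"customerName": "Contoso", "requestType": "address-change"}

    response = requests.post(FLOW_URL, json=payload, timeout=30)
    response.raise_for_status()
    print("Flow triggered, HTTP status:", response.status_code)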

What I did very much enjoy was the presentation of the final layer of tech capability, that is Microsoft 365. This part focused a lot on Microsoft Teams, which now looks set to become the central work and collaboration app for anyone working in the ecosystem. It received many exciting updates, which I'll describe in detail once we get to test them. What I also liked a lot was Project Cortex, a cross-organisation engine which transforms data into knowledge. It is based on Microsoft Graph, and it builds explorable knowledge networks. Also announced was the first preview of the Fluid Framework, a very impressive cross-application, cross-device collaboration technology. I'm incredibly eager to experience it for myself.

Day 2:
The first day of the conference is always devoted to keynotes and theatrical presentations. It is the time when the teams from Microsoft announce the new features and services and start explaining the capabilities in high-level sessions. Day two, however, brings what we all crave: breakout sessions led by the specialists who design and develop the services. So we all scattered across the gigantic convention centre and dove into the topics we find valuable for our teams and customers. For me, the main focus of the day was on the two new Azure services which Satya Nadella announced on stage yesterday.


Azure Synapse Analytics
I've already attended three sessions on Azure Synapse Analytics (ASA) and was lucky enough to chat for a little while with the Program Managers working on the service. They describe ASA as an evolution of the Azure SQL Data Warehouse (SQL DW) and even told me that once the service becomes generally available, every instance of SQL DW will be automatically updated to an ASA workspace. They were, however, cautious about sharing any hints towards a possible date when that might happen. The service is in preview, and the team wants to gather feedback from users to make sure that the offering is mature before they release it for production.

With that in mind, it's important to stress that this is not just another major update with a bit of rebranding. Although the process is an evolution, Synapse Analytics is now much more than a data warehouse. I think the first step is understanding the concept of the ASA workspace. The workspace is the hub for all data-related processes, beginning with loading, through preparation and analysis, all the way to serving. We can create data pipelines, just like with Azure Data Factory, to orchestrate data ingestion (it offers the same set of 90+ connectors), transform the data either with SQL or Spark jobs, and load it into a warehouse.

But this is also the place where data specialists can connect AI and ML or easily visualise the data with PowerBI. Ease of use, simplicity and a single-pane-of-glass experience is what the team behind this service sees as the fundamental advantage. By bringing such extensive functionality under a single roof, they take data out of the traditional silos of big data and data warehousing, thus making the pipelines easier to manage, secure and monitor.

What is also interesting is that the code used for orchestrating data pipelines is the same as for Azure Data Factory. Some ADF features do not and will not exist in ASA, but the two will be compatible. For the time being, we are advised to keep using the combination of Data Factory and Data Warehouse; when Synapse Analytics becomes generally available, there will be a seamless migration path. Azure Data Factory, however, will not go away; it will stay as a separate offering targeted at customers with more complex use cases that cannot be handled by ASA.

I am very excited to see this service and sincerely hope to use it with our customers. I'm also eager to discover even more about it in the upcoming Ignite sessions.
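Since the SQL side of Synapse stays T-SQL compatible with SQL DW, existing client code should keep working after the automatic upgrade. As a minimal sketch (the server name, database and credentials below are placeholders, and it assumes the pyodbc package plus the SQL Server ODBC driver are installed):

    import pyodbc

    # Placeholder connection details for a Synapse / SQL DW SQL pool.
    conn_str = (
        "DRIVER={ODBC Driver 17 for SQL Server};"
        "SERVER=myworkspace.database.windows.net;"
        "DATABASE=mysqlpool;"
        "UID=myuser;PWD=mypassword"
    )

    with pyodbc.connect(conn_str) as conn:
        cursor = conn.cursor()
        # Plain T-SQL works against the pool, just as it did against SQL DW.
        cursor.execute(
            "SELECT TOP 10 name, create_date FROM sys.tables ORDER BY create_date DESC"
        )
        for name, created in cursor.fetchall():
            print(name, created)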


Azure Arc
Azure Arc is a big announcement. The goal of the service is to bring the maturity of Azure's management, governance and security capabilities to on-premises data centres and deployments hosted by other cloud providers. By providing such functionality, Microsoft is taking a shot at becoming the single management plane for hybrid and multi-cloud environments. Features such as RBAC, audit logging, monitoring, cross-tenant management provided by Azure Lighthouse and governance at scale achieved with Azure Policy make the offering very compelling. If one also considers the possibility of adding AWS deployments to Azure Cost Management, we can see that Microsoft is slowly entering the market of multi-cloud management tools. In its current form, there are three scenarios supported by Arc, but the team plans on growing the list in the first half of next year. We can currently:

  • Manage servers deployed on-premises and in other clouds
  • Manage Kubernetes clusters deployed to any location
  • Deploy Azure Data Services, such as Azure SQL and Azure PostgreSQL, to any Kubernetes cluster

The first one is already in public preview, while the latter two are still in limited preview, which requires users to sign up and wait to be selected. It's definitely worth the wait, though. With servers and Kubernetes clusters, we can view and manage them as we do Azure-native resources. That means applying policies, managing access, setting tags and using extensions. For Data Services, we also get cloud billing, flexibility and management. The Azure SQL option is an equivalent of SQL Managed Instance, so it should remain functionally compatible with the good old SQL Server. But we also get on-demand scaling, pay-per-use (most probably including licensing), automatic high availability, cloud backups and, of course, cloud management.

On a technical level, Arc is an extension of the Azure Resource Manager (ARM) plane, so we can expect full compatibility with existing tooling. The team essentially created new resource providers which interface with agents for the different scenarios. I am genuinely excited about Azure Arc, and I can see that I am not the only one. Sessions on this topic are packed full of people, and my colleagues are already setting up a proof-of-concept deployment.
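Because Arc projects external machines into ARM as regular resources (resource type Microsoft.HybridCompute/machines), the existing Azure SDKs can work with them. A minimal sketch, assuming the azure-identity and azure-mgmt-resource packages, a placeholder subscription ID and a subscription that already contains Arc-enabled servers:

    from azure.identity import DefaultAzureCredential
    from azure.mgmt.resource import ResourceManagementClient

    # Placeholder subscription ID; replace with your own.
    SUBSCRIPTION_ID = "00000000-0000-0000-0000-000000000000"

    client = ResourceManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

    # Arc-enabled servers show up in ARM like any native resource,
    # so they can be listed, tagged and governed with the same tooling.
    arc_machines = client.resources.list(
        filter="resourceType eq 'Microsoft.HybridCompute/machines'"
    )
    for machine in arc_machines:
        print(machine.name, machine.location, machine.tags)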

Day 3:

By the third day, the conference is in full swing. We all have our routine already, know the venue a bit and have a decent sense of which sessions we want to follow. The energy is terrific, though; I am learning a lot and having a great time here in Orlando.


Besides learning about the newly announced services, one of my goals for this year's conference was getting better insight into the monitoring solutions for Azure. I've used Azure Monitor, Application Insights and Log Analytics many times, but with the changes to the product family which Microsoft made over the last year, I felt like I had a few questions to ask. The sessions which I followed so far provided a substantial amount of valuable advice, which I'd like to, at least partially, share. Some of the tips were not technical, but rather from the Site Reliability Engineering domain:

  • Reliability should always be considered from the customer's perspective, not the infrastructure perspective
  • Monitoring should include what is running, how well it is running, how well it was running in the past, and the context
  • Define Service Level Indicators (SLIs) and Objectives (SLOs) to align on the measurements and the expected levels of reliability (see the sketch after this list)
  • Align on where to perform specific measurements (VM or load balancer for example)
  • Define your measurements as close to the customer as possible
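To make the SLI/SLO point a bit more concrete, here is a tiny sketch, with made-up numbers, of how an availability indicator could be computed and compared against an objective:

    # Illustrative numbers only: an availability SLI computed from request
    # counts measured as close to the customer as possible.
    successful_requests = 999_120
    total_requests = 1_000_000

    sli = successful_requests / total_requests   # Service Level Indicator
    slo = 0.999                                  # Service Level Objective (99.9%)

    error_budget = 1 - slo                       # allowed failure fraction
    budget_consumed = (1 - sli) / error_budget   # share of the budget used up

    print(f"SLI: {sli:.4%}  SLO: {slo:.1%}")
    print(f"Error budget consumed: {budget_consumed:.0%}")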

From a technical perspective, when using Azure Monitor, I was able to note the following recommendations:

  • Use as few Log Analytics Workspaces as possible; they now support Role-Based Access Control, so security should not be an issue any more for most organisations
  • When defining alerts, aim to use metrics, not logs
  • When defining alerts at scale, for example, for hundreds of virtual machines, use dynamic thresholds
  • Use Network Watcher’s Traffic Analytics to gain powerful insights and a map of connectivity
  • To troubleshoot issues with apps, use App Insights’ Application Map. It offers potent tracing capabilities and even allows users to create a Work Item in Azure DevOps (and other tools)
  • Use Workbooks in Azure Monitor to define custom visualisations for metrics and queries
  • Add Resource Graph queries to your dashboards to enrich them with additional operational information (a sketch of running such a query follows after this list)
  • Share your dashboards by exporting them to a JSON file
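For the Resource Graph tip, here is a minimal sketch of running a query programmatically, assuming the azure-identity and azure-mgmt-resourcegraph packages and a placeholder subscription ID:

    from azure.identity import DefaultAzureCredential
    from azure.mgmt.resourcegraph import ResourceGraphClient
    from azure.mgmt.resourcegraph.models import QueryRequest

    # Placeholder subscription ID; replace with your own.
    SUBSCRIPTION_ID = "00000000-0000-0000-0000-000000000000"

    client = ResourceGraphClient(DefaultAzureCredential())

    # Count virtual machines per region - the kind of operational detail
    # you might pin to a dashboard next to your metrics.
    request = QueryRequest(
        subscriptions=[SUBSCRIPTION_ID],
        query=(
            "Resources"
            " | where type =~ 'microsoft.compute/virtualmachines'"
            " | summarize count() by location"
        ),
    )

    result = client.resources(request)
    for row in result.data:
        print(row)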

I also attended a session on Microsoft Graph, which, despite my many years of working with the Microsoft stack, I wasn't familiar with. It might have a similar name, but it is not the same thing as the Resource Graph which I mentioned above. Microsoft Graph is a unified API surface for working with any of the Microsoft 365 products. That means products like Exchange, SharePoint and OneDrive, but also Azure Active Directory. It allows developers to create apps which extend or interact with any component of the Microsoft 365 suite of products. Those applications can be both interactive, for example a custom email client, and non-interactive, such as a provisioning engine.

What makes Microsoft Graph especially interesting is that we have an excellent Software Development Kit at our disposal. The SDK handles aspects such as authentication, throttling, compression and redirects, and is also available for PowerShell. This means that DevOps engineers who don't have backend development experience can also use the API to create powerful automation scripts. Backend developers now also have a fantastic tool at their disposal in the Microsoft Graph Explorer. It not only provides examples of queries which can be executed against the API, but also code snippets for multiple languages, and information regarding the permissions required to run the requests. It is a very exciting product, which might provide a lot of value for teams looking to automate or extend the functionality of Microsoft 365.
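To give an idea of how approachable the API is, here is a minimal sketch of listing users with app-only authentication. It uses the msal and requests packages rather than the Graph SDK, and the tenant ID, client ID and secret are placeholders for a hypothetical app registration that has been granted the User.Read.All application permission:

    import msal
    import requests

    # Placeholder values for a hypothetical Azure AD app registration.
    TENANT_ID = "00000000-0000-0000-0000-000000000000"
    CLIENT_ID = "11111111-1111-1111-1111-111111111111"
    CLIENT_SECRET = "<client-secret>"

    app = msal.ConfidentialClientApplication(
        CLIENT_ID,
        authority=f"https://login.microsoftonline.com/{TENANT_ID}",
        client_credential=CLIENT_SECRET,
    )

    # Acquire an app-only token for Microsoft Graph.
    token = app.acquire_token_for_client(scopes=["https://graph.microsoft.com/.default"])

    # List the first page of users in the tenant.
    response = requests.get(
        "https://graph.microsoft.com/v1.0/users",
        headers={"Authorization": f"Bearer {token['access_token']}"},
        timeout=30,
    )
    response.raise_for_status()
    for user in response.json()["value"]:
        print(user["displayName"], user.get("userPrincipalName"))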


In between the sessions, I spent some time at the Microsoft Showcase section, looking at the various booths and stands. There, among countless Microsoft Teams presentations, I found what I consider to be a hidden gem of this year's Ignite: Project Ida. Marketed as “Microsoft News”, it is an “Insights and Discovery Accelerator”. Created initially with news agencies and TV broadcasters in mind, this AI lab sample is developed by a small but enthusiastic team. The project is still at an early stage of its lifecycle but can already provide exciting functionality to any organisation managing large amounts of information assets.

It allows users to import extensive sets of data, such as documents, videos and audio streams, and then applies AI to categorise the information. Metadata is generated automatically and organised into graph-like relationships. Users can then search based on selected criteria and explore the topic of interest by following related documents, people, topics, events and so on. By using advanced AI, the solution creates digitised versions of scanned documents, highlights sections matching search criteria, indexes videos and even allows in-browser video editing. The potential value for companies storing large numbers of contracts, customer correspondence or any other kind of archive is genuinely inspiring. The project is a cloud-native solution and is highly extensible. The team even hinted that in the future it might become open source or even be integrated into the broad Microsoft 365 suite of products.


That's all for today, and it might as well be the last daily report from the convention centre in Orlando. I am heading into the final full day of Ignite, so I'm not sure if I'll have the time to write down my observations. I will, however, be sure to share some final thoughts as soon as possible.

Day 4: Final
It's been over a week since I returned from Orlando, and it's about time I shared my final thoughts. For the first three days, I was able to share updates and observations daily. On Thursday, however, Microsoft invited us to the Ignite Celebration, which left no time to write. Day five, from the conference perspective, lasted only until 14:00, and much of the event was already being packed for shipment home. As the event was coming to an end, I wanted to watch the last sessions in person. One of them was especially valuable to me: BRK3233, “What's new with Azure Resource Manager (ARM) templates for your deployments” by Brian Moore and Alexander Frankel, which featured some highly anticipated updates. We had a meeting with the guys and their team earlier in the week, but it was great to see the session and have another chat with them afterwards. The services that the Azure Resource Manager team develops are the technical foundation of our Azure Landing Zone offering, so we make sure to stay in touch and give feedback.

Getting in touch and giving feedback was also one of the dominant themes of this year's Ignite. In The Hub, which acted as the exhibition floor of the convention, Microsoft set up a massive showcase space with a booth from what seemed like every product team they have. Each stand had program managers present all day long, inviting attendees to ask questions, share feedback and connect. It was a genuinely refreshing feeling to be able to have a chat with the folks you typically see in Azure Friday or Microsoft Mechanics videos online.

Being persistent enough, lucky enough, or both, I even had the chance to exchange a few words, or at least get a selfie, with some of the big names such as Jeffrey Snover and Mark Russinovich. These two gents lead me to the next topic: how different an experience Ignite can be. Overall you can distinguish two types of presentations during the conference:

  • Very popular, maybe not too technical, but inspiring sessions delivered by one of the highly recognisable presenters. These are the sessions which you attend for the speaker, not the content. But rest assured, the content will be engaging; it just might not directly contribute to your daily work. After all, how many of us work with quantum computing? Still, names like Mark Russinovich and Jeffrey Snover, whom I mentioned above, but also Julia White, Sami Laiho, Paula Januszkiewicz, Scott Hanselman and Donovan Brown are considered a “must-see”.
  • Niche, technical sessions, held in smaller, half-packed rooms. There you will learn a lot and dive deep into the wiring of the services which your business relies on. Just be sure to stay afterwards and have a chat, or at least listen in on the questions that others have for the program managers. You can fill in any blanks or learn about details which didn't fit into the tight 45-minute slot. To me, these conversations turned out to be invaluable.

During one of those spontaneous Q&A sessions, I was lucky to bump into Dean Cefola. If the name doesn't tell you much, Dean is on the Microsoft FastTrack for Azure team, and he is the guy behind the “Azure Academy” YouTube channel. I find his videos an excellent source of knowledge when I need to quickly get up to speed with a topic I haven't had much experience with in the past. If you haven't seen any of them, I highly recommend that you check out the channel.

But the conference wasn't only work and networking. I also had some fun, and the Ignite Celebration was terrific. Microsoft rented two entire Universal Studios amusement parks for the conference attendees and gave us free access from late afternoon until midnight. Food and drinks were served, and we had an absolute blast with the rollercoasters and other attractions. The evening also reminded us how big the conference was: despite entry to the parks being exclusive to the private event, it was still very crowded, and waiting times for the more popular rides got as high as 90 minutes. I hope to return to Orlando next year to once again learn a lot and enjoy the vibe, the weather and the fun. At the time of writing, however, the location of Ignite 2020 has not been announced yet. Rumour has it that the conference might return to Chicago's McCormick Place after five years. If that turns out to be the case, those who were there in 2015 will hope for a better experience than the first time around.
