This year marks the fifth year in a row we have attended the Microsoft Ignite conference (wiki). Ignite is Microsoft's biggest conference, attended by about 30,000 professionals, with more than 1,000 sessions and 200+ hands-on experiences. In this blog post I will share the most interesting takeaways.
- Azure Arc: unified management of cloud and on-premises data centers. Imagine deploying resources in our own data centers with the same ease as in the cloud, with the same powerful tools such as security insights, policies, etc.
- Azure Synapse: structured and unstructured data storage and analysis tools under one roof. Enterprise grade: 10,000 concurrent users without major performance degradation or service outages.
- Project Silica: Microsoft is working on a durable way of storing data by writing it into glass. Satya had a small glass sample with the Superman movie written into it… cool!
- Azure Quantum: quantum computing simulated on “normal” servers, so basically anyone can start writing programs in the quantum language Q# and eventually run them on real quantum computers some day. Everything is open source!
- Visual Studio Online: spin up a personalized development environment (your own isolated virtual machine) with a browser-based Visual Studio, with the ability to debug and to provide remote access to e.g. a site running on that machine.
- Power Virtual Agents & Automate: a showcase of how easily you can build a chat bot that creates records in a legacy application, without coding. Oriflame is already using the previous generation in Mexico for a chat-bot solution.
- Project Cortex: artificial intelligence inside the organization. It should leverage data and provide context and knowledge on top of it. Imagine searching for “sick day” in Bing and getting not only public results but also our company policies. Imagine receiving an email about “PDP”: clicking on the word pops up an automatically built knowledge base explaining that it means Product Detail Page, which team/colleagues are working on the feature, etc.
- Microsoft Edge: a new generation of the Microsoft web browser. It switched to the Chromium engine and is installed independently of the OS. Legacy Internet Explorer runs seamlessly “inside”. Seems like the right resurrection. Will you give Edge another try?
- Microsoft Endpoint Manager: brings together Intune and System Center to ease management of endpoint devices, with the ability not only to detect potential vulnerabilities but also to mitigate them by applying fixes nearly instantly.
- HoloLens 2 has begun shipping to customers ($3,500/device, or a $99/month subscription for the developer edition).
Visual Studio Enhancements
- IntelliCode: based on AI, analysis of thousands of open-source projects, and the context of the code you are writing, it provides code completion that promotes best practices. It won't just give you property and function names with basic documentation; it tries to predict what you are about to achieve, suggesting method overloads and guessing parameters… more info: https://visualstudio.microsoft.com/services/intellicode/
- XML documentation: now supports basic markup such as bold/emphasis in the IntelliSense peek window. Finally, the <inheritdoc/> tag is here, so there is no need to re-comment inherited/overridden methods.
- Unit tests: got new features, live unit testing (Enterprise edition only) and hierarchical filtering by namespace… useful when you have tens, hundreds or even thousands of unit tests.
- Live Share: imagine you want to share your Visual Studio, e.g. for a code review, pair programming, or help with debugging. This is now possible via a simple link that gives your colleague the ability to see and edit your *live* code, your debugging session, your environment…
- A terminal is finally integrated inside the VS IDE.
- Easy development for Linux (e.g. for Node.js): just open an Ubuntu command line, clone a git repo, and run VS Code from that command line. This starts both a VS Code server inside Linux and a VS Code instance on your local Windows machine, connected via Live Share, so you can develop with VS Code inside the Linux subsystem, e.g. against a Node.js server.
Visual Studio Online
- Rapidly speeds up environment preparation and overall productivity.
- In a matter of minutes you can spin up your own dedicated virtual machine with a prepared environment and start coding.
- Uses the “Live Share” technology: imagine Visual Studio as a front end for a server running on a remote machine. Builds, unit tests, IntelliSense/code, debugging… everything runs remotely; only data (your updated code, outputs, etc.) is transferred. Lightning fast.
- Costs are unknown at the moment; you will definitely pay for the virtual machine that is running, maybe more because of the VS license. It should also be possible to connect with VS Code.
Microservices and rapid development
- Development of microservices (e.g. via containers in Azure Kubernetes Service) is something Microsoft is aiming at a lot. Imagine you are developing one service among many. How would you test the integration of your service? How would you mock data provided by other microservices?
- Azure Dev Spaces is the answer. It is a rapid, iterative Kubernetes development experience for teams working in Azure Kubernetes Service (AKS) clusters. You can collaborate with your team in a shared AKS cluster, test your version of a service before committing, deploy a version of the service from a pull request to run integration tests, etc. More info: https://docs.microsoft.com/en-us/azure/dev-spaces/about
Cloud native apps: OAM, DAPR
- Microsoft is trying to standardize the way we describe application architecture (components) and the deployment model (environments, services) via the Open Application Model initiative. It separates the responsibilities of the developer (describes what the application looks like), the application operator (describes how the application will be deployed, e.g. what the DEV, UAT and STG environments look like = configuration) and the infrastructure operator (responsible for the underlying platform providing services such as data storage).
- In effect, this should standardize multi-cloud deployments.
- Another initiative is the Distributed Application Runtime (DAPR). It could be the “next big thing” for running and operating microservices in the cloud. It is an event-driven, portable runtime for building microservices on cloud and edge:
- Dapr enables developers using any language or framework to easily write microservices, providing industry best practices to solve distributed systems problems.
- Dapr provides consistency and portability through open APIs and extensible components that are community-driven.
- Dapr handles state, resource bindings and pub/sub messaging, which enable event-driven, resilient architectures that scale.
- Dapr is platform agnostic and runs on any infrastructure, including public clouds and edge devices with its open APIs.
- On AKS it works in such a way that every pod with an application gets a DAPR sidecar exposing localhost endpoints that provide all the infrastructure services such as state, pub/sub messaging, etc., with standardized retry, circuit-breaker and similar patterns.
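To make the sidecar model concrete, here is a minimal Python sketch of how an application would talk to the Dapr sidecar's localhost state API. The store name `statestore` and the key `order-1` are hypothetical examples; the sketch only builds the request URL and payload rather than performing live HTTP calls, since it assumes (but does not require) a sidecar listening on the default port 3500.

```python
import json

# Assumption: 3500 is Dapr's default sidecar HTTP port (configurable per pod).
DAPR_PORT = 3500

def state_url(store: str, key: str = "") -> str:
    """Build the sidecar's localhost state-API URL for a given state store."""
    base = f"http://localhost:{DAPR_PORT}/v1.0/state/{store}"
    return f"{base}/{key}" if key else base

def save_state_payload(key: str, value) -> str:
    """Dapr expects a JSON array of key/value records when saving state."""
    return json.dumps([{"key": key, "value": value}])

# The application would POST save_state_payload(...) to state_url("statestore")
# and GET state_url("statestore", "order-1") to read the value back; the
# sidecar takes care of retries and the concrete state-store backend.
print(state_url("statestore", "order-1"))
print(save_state_payload("order-1", {"total": 42}))
```

The point of the design is that the application never links a storage SDK; it only speaks plain HTTP to localhost, which is what makes Dapr language- and platform-agnostic.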
The overall feeling from the conference so far is that Microsoft is pushing IoT and “edge” computing a lot. You can deploy your code (functions, apps, etc.), AI/ML analysis, and even SQL Server closer to your end users, whether it is an Azure Stack, Box, Edge or a Raspberry Pi device.
Want to be truly serverless? Azure has the answer: simply run your website via the Azure Storage static website hosting feature, put Front Door in front, use auto-scalable Cosmos DB for data and Azure Functions for compute, with the frontend written as a Single Page Application (like we do with React), and you have a truly auto-scalable solution with a pay-as-your-usage-grows model. Of course with monitoring via App Insights all the way. More info: https://aka.ms/theurlist and the app itself:
And by the way, the Azure Functions Premium plan is now generally available: pre-warmed instances with up to 4 cores/14 GB RAM that can scale up to 100 units and more: https://azure.microsoft.com/en-us/updates/azure-functions-premium-plan-is-now-generally-available/
A set of tools including Power Automate, Power BI, Power Apps, Power Virtual Agents… basically everything advanced business “power” users need to create cool “no-code” solutions for their needs. One important fact: it all targets enterprise customers. With an Office 365 license a user can use some of the features (run canvas apps, standard data connectors, Flow/Power Automate), while other features (model-driven apps, premium/custom connectors, data service, custom portals…) are available only with standalone/extended licensing plans. If you want to dive deep into the platform, see this session: https://myignite.techcommunity.microsoft.com/sessions/83518. Licensing: https://docs.microsoft.com/en-us/power-platform/admin/pricing-billing-skus
One cool new feature, not related to SQL Server directly but used heavily in the sessions, is Azure Data Studio (a cross-platform database tool for data professionals working with on-premises and cloud data platforms on Windows, macOS and Linux; see https://docs.microsoft.com/en-us/sql/azure-data-studio/what-is).
Microsoft is focusing not only on cloud compute and data storage engines, but on a wide variety of use cases, e.g. SQL Edge, which can run even on a Raspberry Pi. SQL Server Big Data Clusters, on the other hand, allow you to deploy scalable clusters of SQL Server, Spark and HDFS containers running on Kubernetes. These components run side by side, enabling you to read, write and process big data from Transact-SQL or Spark, so you can easily combine and analyze your high-value relational data with high-volume big data (https://docs.microsoft.com/en-us/sql/big-data-cluster/big-data-cluster-overview?view=sql-server-ver15).
Accelerated database recovery: another cool feature, especially when running large transactions over millions of rows: cancelling a transaction is now nearly instantaneous compared to the previous experience. Another use case is a service interruption (service restart/power loss): recovery of the database after a crash could previously take minutes to hours, during which the database was unavailable. Now it can be done within seconds. More information on how to enable it here: https://www.sqlshack.com/accelerated-database-recovery-instant-rollback-and-database-recovery/
It’s already happening and Microsoft wants to be on top. A few highlights:
- Microsoft is building a quantum computing foundation, providing SDKs and a language (Q#) to build quantum algorithms.
- Think of quantum as a specialized quantum co-processor for a classical PC, accelerating specific tasks, e.g. a quantum random number generator.
- Quantum teleportation: a future way to share secrets (keys) and provide a safe way to communicate.
- Bono is a quantum circuit visualizer that allows you to edit quantum circuits and evaluate them in real time. It also generates Q# code that you can take and run on the Microsoft Quantum Development Kit and, in the future, on a real quantum computer: https://github.com/microsoft/bono
- It is estimated that by ~2030 quantum computing will be able to break current crypto algorithms. By then, new post-quantum signatures etc. need to already be available. http://openquantumsafe.org
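The quantum RNG mentioned above is the “hello world” of quantum computing: put a qubit into equal superposition with a Hadamard gate, then measure it. Below is a toy *classical simulation* of that idea in Python (not Q#, and not how a real quantum device works internally), just to illustrate why the measurement yields a fair random bit.

```python
import math
import random

def hadamard(amplitudes):
    """Apply a Hadamard gate to a single-qubit state (a0, a1)."""
    a0, a1 = amplitudes
    s = 1 / math.sqrt(2)
    return (s * (a0 + a1), s * (a0 - a1))

def measure(amplitudes, rng=random.random):
    """Sample a measurement outcome: P(0) is the squared amplitude of |0>."""
    a0, _ = amplitudes
    return 0 if rng() < a0 * a0 else 1

def random_bit():
    qubit = (1.0, 0.0)        # start in |0>
    qubit = hadamard(qubit)   # equal superposition: P(0) = P(1) = 0.5
    return measure(qubit)

bits = [random_bit() for _ in range(1000)]
print(sum(bits))  # roughly 500 ones out of 1000 draws
```

On real hardware the randomness comes from quantum measurement itself rather than a pseudo-random generator, which is what makes it attractive for cryptography.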
Session recordings (75m), expert (400+) level here: https://myignite.techcommunity.microsoft.com/sessions/82019
.NET Core 3.1
- Native gRPC support (a modern open-source remote procedure call framework over HTTP/2 with bi-directional streaming)
- Worker services: long-running apps such as queue workers (Windows: services, Linux: systemd)
- Long-Term Support (LTS)
- WPF and Windows Forms support
- Side-by-side support & self-contained EXEs
- Full-stack web development with C# and Razor
- New C# 8.0 language features such as switch expressions, nullable reference types (a variable must be explicitly declared nullable if null is assigned to it), “using” declarations (no need for blocks), asynchronous streams, etc.; see https://docs.microsoft.com/en-us/dotnet/csharp/whats-new/csharp-8
- ASP.NET Core Blazor 3: full-stack web development with C# that runs in all browsers via WebAssembly (near-native performance); release in May 2020
- Support for IoT (Raspberry Pi etc)
- Future? A .NET 5 sneak peek, expected November 2020
Inside Azure Datacenters
- In total, 52 regions are already available (+2 coming soon)
- Comprehensive information about the network architecture and what current data centers look like
- Project Natick: an ocean-submerged mini module with several racks and hundreds of servers as a PoC (announced last year)
- Project Natick 2: a mini region of 12 such modules
- Micron Scuti-o SSDs now reach up to 9 GB/s throughput and 8 µs latency
- New NVv4 VMs with fractional GPUs
- Liquid cooling testing
- Private endpoints for public services such as Azure SQL databases within our virtual network
- Not only can we close the public endpoints for Blob Storage, SQL, etc., but we can also block access to the public internet from within the virtual machine, greatly limiting the attack surface
- Pricing is just about €4/month per endpoint + €0.005 per GB of in/out data transferred
- Another great feature for improving security
- Imagine you have a completely disconnected server (no public/internal endpoints). Thanks to Bastion you can still connect to the server via Remote Desktop or SSH from within the Azure portal
- Another cool thing is that all sessions can be recorded (optional feature)
- You can also use Bastion to connect other devices, such as servers behind a firewall, to the Azure portal
- Price is €0.161 per hour of usage
Azure Storage highlights
- Until now, Azure Storage (Blobs, etc.) was provided as clusters (with limits around 30,000 operations/s)
- Now Microsoft has implemented something called cluster groups, enabling capacity to be shared among clusters and thus providing virtually unlimited bandwidth and storage.
- Moreover, clusters (and groups) can now be geo-replicated too
- You can even specify a policy to replicate just a subset of data, etc.
- Indexing for storage and “tagging” of blobs (like a taxonomy of blobs) was announced
- Another cool feature is Azure Blob Quick Query, with a SQL-like language
- Shared disks were announced. Until now, only one VM could hold the lease for writing to the underlying disk. Now a disk can be shared across multiple VMs, and in case of failover another VM can take over the write lease
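The appeal of Quick Query is that the SQL-like filter runs server-side, so only matching rows leave the storage account instead of the whole blob. Here is a toy Python illustration of that filtering idea (this is plain stdlib code, not the Azure SDK; the blob contents and the predicate are made-up examples):

```python
import csv
import io

def quick_query(blob_text: str, predicate) -> list:
    """Filter CSV 'blob' contents row by row, as the service would do
    server-side before sending anything over the wire."""
    reader = csv.DictReader(io.StringIO(blob_text))
    return [row for row in reader if predicate(row)]

# A tiny CSV blob standing in for a large object in Blob Storage.
blob = "sku,qty\nA,5\nB,12\nC,3\n"

# Rough equivalent of: SELECT * FROM BlobStorage WHERE qty > 4
rows = quick_query(blob, lambda r: int(r["qty"]) > 4)
print(rows)  # only the rows for A and B are "downloaded"
```

With a multi-gigabyte CSV or JSON blob, pushing the predicate down to the service like this can cut both transfer costs and latency dramatically.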
The name is inspired by the eight-legged microscopic creature, the tardigrade, also known as the water bear. Tardigrades can survive nearly anything, and so should a virtual machine running in the cloud. How does Microsoft want to achieve this? When a software error occurs in the host system, it takes a full dump of the system state for later analysis and then tries to recover from the failure either by restarting the services or by resetting the whole Hyper-V subsystem to a healthy state. During the operation the customer's VM is paused for a brief time, a second or so, and then resumed by the healthy host.
More information here: https://azure.microsoft.com/en-gb/blog/improving-azure-virtual-machines-resiliency-with-project-tardigrade/
Few other highlights
- ATLAS: a multitenant, serverless container platform for container-based Azure services (internal tool)
- Pre-caching containers during provisioning: instead of minutes (for large containers), provisioning drops to seconds
Recommended session https://myignite.techcommunity.microsoft.com/sessions/82058
I was personally surprised by so much cool news. Well done, Microsoft, well done!