Recently released: Azure Data Architecture Guide (ADAG) – 36 Articles for Data Professionals

The Azure Data Architecture Guide:

The guide is structured around a basic pivot: The distinction between relational data and non-relational data.

Relational data is generally stored in a traditional RDBMS or a data warehouse. It has a pre-defined schema (“schema on write”) with a set of constraints to maintain referential integrity. Most relational databases use Structured Query Language (SQL) for querying. Solutions that use relational databases include online transaction processing (OLTP) and online analytical processing (OLAP).

Non-relational data is any data that does not use the relational model found in traditional RDBMS systems. This may include key-value data, JSON data, graph data, time series data, and other data types. The term NoSQL refers to databases that are designed to hold various types of non-relational data. However, the term is not entirely accurate, because many non-relational data stores support SQL compatible queries. Non-relational data and NoSQL databases often come up in discussions of big data solutions. A big data architecture is designed to handle the ingestion, processing, and analysis of data that is too large or complex for traditional database systems.

Within each of these two main categories, the Data Architecture Guide contains the following sections:

  • Concepts. Overview articles that introduce the main concepts you need to understand when working with this type of data.
  • Scenarios. A representative set of data scenarios, including a discussion of the relevant Azure services and the appropriate architecture for the scenario.
  • Technology choices. Detailed comparisons of various data technologies available on Azure, including open source options. Within each category, we describe the key selection criteria and a capability matrix, to help you choose the right technology for your scenario.

This guide is not intended to teach you data science or database theory — you can find entire books on those subjects. Instead, the goal is to help you select the right data architecture or data pipeline for your scenario, and then select the Azure services and technologies that best fit your requirements. If you already have an architecture in mind, you can skip directly to the technology choices.

The guide's top-level sections are:

  • Traditional RDBMS – Concepts, Scenarios
  • Big data and NoSQL – Concepts, Scenarios
  • Cross-cutting concerns


Technical Reference Implementation for Enterprise BI and Reporting

Azure offers a rich data and analytics platform for customers and ISVs seeking to build scalable BI and reporting solutions. However, customers face pragmatic challenges in building the right infrastructure for enterprise-grade production systems. They have to evaluate the various products for security, scale, performance and geo-availability requirements. They have to understand service features and their interoperability, and they must plan to address any perceived gaps using custom software. This takes time and effort, and often the resulting end-to-end system architecture is sub-optimal.

ref: https://github.com/Azure/azure-arch-enterprise-bi-and-reporting/blob/master/README.md

The BitLocker encryption key cannot be obtained from the Trusted Platform Module (TPM)… Unable to do a clean install on HP Spectre x360 (2017) with UEFI enabled.

Since there seemed to be few answers out there, I decided to blog about a system installation issue I ran into…

Ran into a few problems recently after deciding to upgrade the SSD in a new HP Spectre x360. The supplied drive was 512GB, and I was upgrading it to 1TB, as 512GB is a little on the low side for the work I do.

Having taken backups and replaced the old drive, I proceeded to install from a standard ISO pushed onto a USB.

The default settings (UEFI enabled) refused to see the standard ISO for Windows 10 Pro, so I had to switch to legacy mode (which disables Secure Boot) within the BIOS. This allowed the USB to be detected and the install to go smoothly.

Cut to the install completed and the system running smoothly… for my work I must enable BitLocker and encrypt my drive… but going through the options, the verification check fails:

Error - BitLocker could not be enabled.

The BitLocker encryption key cannot be obtained from the Trusted Platform Module (TPM).

I can force-enable BitLocker, but TPM will not function properly and I have to enter the decryption key every time I start the computer.

UEFI is still disabled.

“TPM.msc” (via the Start menu) and “Get-Tpm” (from an admin PowerShell prompt) confirm that TPM is enabled, but operating with reduced functionality and not ready for full use.
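For anyone wanting to reproduce that check, this is roughly what I mean, run from an elevated PowerShell prompt (TpmPresent and TpmReady are standard properties of the Get-Tpm output; if I remember rightly, TpmReady was the one reflecting the “not ready for full use” state on my machine):

# Full TPM status from an elevated PowerShell prompt
Get-Tpm

# Just the headline properties
Get-Tpm | Select-Object TpmPresent, TpmReady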

A quick check seems to indicate that TPM 1.2 is OK with legacy boot mode, but TPM 2.0 (as in my new system) requires UEFI, along with Secure Boot, to be enabled for TPM to fully function.

Enabling UEFI then obviously fails to recognise the drive, since the OS was installed in legacy mode, which partitions the disk as MBR (master boot record), as opposed to the GPT (GUID partition table) that UEFI requires.

Incidentally, with TPM operating in this diminished mode, Hyper-V cannot use it and will fail for any encrypted VMs (also a requirement for me).

Checking the HP Support site, their only recommendation is to pay them for the HP install media, which installs their version of the OS along with all their utilities and bloatware… erm, no thanks!


Some light research showed that I could create my own UEFI boot media using Rufus… Rufus is an open-source, standalone EXE that you can run locally – full details can be found here – https://github.com/pbatard/rufus/wiki/FAQ – and it can be downloaded from http://rufus.akeo.ie/downloads/

Rufus

However, having already completed the installation, I wanted to avoid this if I could.


… what to do …

Well, Microsoft to the rescue. The latest version of Windows 10 now includes a new tool that allows an MBR install to be converted to GPT with one line from the Command Prompt… the tool has other capabilities too.

The following command, run from an elevated (administrator) command prompt, will convert the current disk to GPT.

C:\WINDOWS\system32\mbr2gpt.exe /convert /allowFullOS

MBR2GPT

MBR2GPT.EXE converts a disk from the Master Boot Record (MBR) to the GUID Partition Table (GPT) partition style without modifying or deleting data on the disk. The tool is designed to be run from a Windows Preinstallation Environment (Windows PE) command prompt, but can also be run from the full Windows 10 operating system (OS) by using the /allowFullOS option.
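If you want to be a little more cautious than just running the conversion straight off, the tool also has a /validate switch (covered in the docs linked below) that checks the disk meets the conversion requirements without changing anything. Something along these lines from an elevated command prompt; add /disk:<n> to either command if you need to target a specific disk:

REM Check that the system disk can be converted (makes no changes at this stage)
C:\WINDOWS\system32\mbr2gpt.exe /validate /allowFullOS

REM Then perform the actual conversion
C:\WINDOWS\system32\mbr2gpt.exe /convert /allowFullOS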

After the conversion completes (for me it took only a few seconds), you need to reboot and change your BIOS settings to re-enable UEFI along with Secure Boot.

Full details of MBR2GPT may be found here – https://docs.microsoft.com/en-us/windows/deployment/mbr-to-gpt


You should then find that TPM is functional again…

… however, if you are still having issues… you should clear and prepare the TPM by i) opening TPM.msc, ii) choosing “Clear TPM”, iii) rebooting, iv) opening TPM.msc again, and then v) choosing “Prepare the TPM”.
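If you would rather do that from PowerShell than via the TPM.msc snap-in, the TrustedPlatformModule cmdlets appear to cover the same steps – a rough sketch rather than exactly what I ran, and remember that clearing the TPM wipes any keys stored in it, so have your BitLocker recovery key to hand first:

# WARNING: clearing the TPM removes all keys it holds – suspend/back up BitLocker first
Clear-Tpm

# Reboot the machine…
Restart-Computer

# …then, after the reboot, provision the TPM again
Initialize-Tpm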

TPM.MSC


After jumping through a few hoops I was able to successfully encrypt my drive and then enable TPM encryption within Hyper-V.
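For the Hyper-V part, the virtual TPM on a Generation 2 VM can be switched on with the Hyper-V PowerShell cmdlets – a minimal sketch, with “MyVM” as a placeholder name; the VM needs a key protector before the vTPM can be enabled:

# Create a local key protector for the VM (required before a virtual TPM can be enabled)
Set-VMKeyProtector -VMName "MyVM" -NewLocalKeyProtector

# Enable the virtual TPM for the Generation 2 VM
Enable-VMTPM -VMName "MyVM"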

BitLocker Enabled

BitLocker Enabled – note: you can open TPM.msc from start menu (by typing TPM.msc) or from BitLocker window (“TPM Administrator”).

Our solution included in Microsoft Ignite 2017 Keynote

Azure Cosmos DB, Azure DW, Machine Learning, Deep Learning, Neural Networks, TensorFlow, SQL Server, ASP.NET Core… are just a few of the components that make up one of the solutions we are currently developing.

I have been under a social media embargo until today, but now that the Microsoft Ignite 2017 keynote has taken place, I can proudly say that the solution our team has been working on for some time was part of the keynote addresses.

During the second keynote, led by Scott Guthrie, Danielle Dean, a Data Scientist Lead, discussed at a high level one of the solutions we are developing at Jabil, which involves advanced image recognition of circuit board issues. The keynote focused on the data science portion of the solution and introduced the new Azure Machine Learning Workbench to the packed audience.

Tomorrow morning there is a session – “Using big data, the cloud, and AI to enable intelligence at scale” (Tuesday, September 26, from 9:00 AM to 10:15 AM, in Hyatt Regency Windermere X)… during which we will be going into a bit more detail, and the guys at Microsoft will be expanding on the new AI and Big Data machine learning capabilities (session details via this link).

Visual Studio 2017 version 15.3 Release Notes

Release Date: August 18, 2017 – Visual Studio 2017 version 15.3.1

Issues Fixed in August 18, 2017 Release

This release addresses a number of customer-reported issues; see the full release notes (linked below) for the list.


Summary: What’s New in this Release

  • Accessibility Improvements make Visual Studio more accessible than ever.
  • Azure Function Tools are included in the Azure development workload. You can develop Azure Function applications locally and publish directly to Azure.
  • You can now build applications in Visual Studio 2017 that run on Azure Stack and government clouds, like Azure in China.
  • We improved .NET Core development support for .NET Core 2.0, and Windows Nano Server containers.
  • In Visual Studio IDE, we improved Sign In and Identity, the start page, Lightweight Solution Load, and setup CLI. We also improved refactoring, code generation and Quick Actions.
  • The Visual Studio Editor has better accessibility due to the new ‘Blue (Extra Contrast)’ theme and improved screen reader support.
  • We improved the Debugger and diagnostics experience. This includes Point and Click to Set Next Statement. We’ve also refreshed all nested values in variable window, and made Open Folder debugging improvements.
  • Xamarin has a new standalone editor for editing app entitlements.
  • The Open Folder and CMake Tooling experience is updated. You can now use CMake 3.8.
  • We made improvements to the IntelliSense engine, and to the project and the code wizards for C++ Language Services.
  • Visual C++ Toolset supports command-prompt initialization targeting.
  • We added the ability to use C# 7.1 Language features.
  • You can install TypeScript versions independent of Visual Studio updates.
  • We added support for Node 8 debugging.
  • NuGet has added support for new TFMs (netcoreapp2.0, netstandard2.0, Tizen), Semantic Versioning 2.0.0, and MSBuild integration of NuGet warnings and errors.
  • Visual Studio now offers .NET Framework 4.7 development tools to supported platforms with 4.7 runtime included.
  • We added clusters of related events to the search query results in the Application Insights Search tool.
  • We improved syntax support for SQL Server 2016 in Redgate SQL Search.
  • We enabled support for Microsoft Graph APIs in Connected Services.

Read more at https://www.visualstudio.com/en-gb/news/releasenotes/vs2017-relnotes#15.3.26730.08

 

.NET Core 2.0 and ASP.NET Core 2.0 Released

Been busy the past couple of weeks, but if, like me, you are catching up… on 14th Aug, Microsoft released .NET Core 2.0, including ASP.NET Core 2.0.

.NET Core 2.0

.NET and C# – Get Started in 10 Minutes

ASP.NET Core 2.0

This release features compatibility with .NET Core 2.0, tooling support in Visual Studio 2017 version 15.3, and the new Razor Pages user-interface design paradigm. For a full list of updates, you can read the release notes, and the ASP.NET Announcements GitHub repository lists the changes from previous versions of ASP.NET Core. The latest SDK and tools can be downloaded from https://dot.net/core.

Read more at https://blogs.msdn.microsoft.com/webdev/2017/08/14/announcing-asp-net-core-2-0/