Since there seemed to be limited answers out there, I decided to blog about a system installation issue I ran into…
I ran into a few problems recently after deciding to upgrade the SSD in a new HP Spectre x360. The supplied drive was 512GB, and I was upgrading it to 1TB, as 512GB is a little on the low side for the work I do.
Having taken backups and replaced the old drive, I proceeded to install from a standard ISO pushed onto a USB.
The default settings (UEFI enabled) refused to see the standard ISO for Windows 10 Pro, so I had to switch to Legacy mode (which disables Secure Boot) within the BIOS. This allowed the USB to be detected and the install to go smoothly.
Cut to the install completed and the system running smoothly… For my work I must enable BitLocker and encrypt my drive, but going through the options, the verification check fails…
I can force-enable BitLocker, but the TPM will not function properly and I have to enter the decryption key every time I start the computer.
UEFI is still disabled.
“tpm.msc” (through the Start menu) and “Get-Tpm” (through an admin PowerShell) confirm that the TPM is enabled but operating with reduced functionality and not ready for full use.
A quick check seems to indicate that TPM 1.2 is OK with legacy boot mode, but TPM 2.0 (as in my new system) requires UEFI, along with Secure Boot, to be enabled for the TPM to fully function.
Enabling UEFI then fails to recognise the drive, since a Legacy-mode install partitions the disk with an MBR (Master Boot Record), as opposed to the GPT (GUID Partition Table) that UEFI requires.
Incidentally, with the TPM operating in a diminished mode, Hyper-V cannot use it either, and will fail on any encrypted VMs (also a requirement for me).
Checking the HP Support site, their only recommendation is to pay them for the HP install media, which will install their version of the OS along with all their utilities and bloatware… erm, no thanks!
Some light research showed that I could create my own UEFI boot media using Rufus, an open-source stand-alone EXE that you can run locally. Full details can be found here – https://github.com/pbatard/rufus/wiki/FAQ – and it can be downloaded from http://rufus.akeo.ie/downloads/
However, having already completed the installation, I wanted to avoid a reinstall if I could.
… what to do …
Well, Microsoft to the rescue. The latest version of Windows 10 now includes a new tool, MBR2GPT, which allows an MBR install to be converted to GPT with one line from the Command Prompt… the tool has additional abilities too.
The following commands, run from an elevated (administrator) Command Prompt, will first validate that the disk can be converted, and then convert it to GPT.
C:\WINDOWS\system32\mbr2gpt.exe /validate /allowFullOS
C:\WINDOWS\system32\mbr2gpt.exe /convert /allowFullOS
After the conversion is completed (for me it only took a few seconds), you need to reboot and change your BIOS settings to enable UEFI along with Secure Boot.
Full details of MBR2GPT may be found here – https://docs.microsoft.com/en-us/windows/deployment/mbr-to-gpt
You should then find that TPM is functional again…
… however, if you are still having issues, you should clear and re-prepare the TPM by i) opening tpm.msc, ii) choosing “Clear TPM”, iii) rebooting, iv) opening tpm.msc again, and then v) choosing “Prepare the TPM”.
After jumping through a few hoops I was able to successfully encrypt my drive and then enable TPM encryption within Hyper-V.
Azure Cosmos DB, Azure DW, Machine Learning, Deep Learning, Neural Networks, TensorFlow, SQL Server, ASP.NET Core… are just a few of the components that make up one of the solutions we are currently developing.
I have been under a social media embargo until today, but now that the Microsoft Ignite 2017 keynote has taken place, I am able to proudly say that the solution our team has been working on for some time was part of the keynote addresses.
During the second keynote, led by Scott Guthrie, Danielle Dean, a Data Scientist Lead @Microsoft, discussed at a high level one of the solutions we are developing at Jabil, which involves advanced image recognition of circuit board issues. The keynote focused on the data science portion of the solution and introduced the new Azure Machine Learning Workbench to the packed audience.
Tomorrow morning there is a session – “Using big data, the cloud, and AI to enable intelligence at scale” (Tuesday, September 26, from 9:00 AM to 10:15 AM, in Hyatt Regency Windermere X)… during which we will be going into a bit more detail, and the guys at Microsoft will be expanding on the new AI and Big Data machine learning capabilities (session details via this link).
Some cool stuff ahead today… keynote coming up…
Release Date: August 18, 2017 – Visual Studio 2017 version 15.3.1
Issues Fixed in August 18, 2017 Release
These are the customer-reported issues addressed in this version:
- Update Git version to address security fix.
- Add Watch displays the wrong line of code.
- F# Editor loses focus when typing arrow, backspace, or newline keys.
- R Tools missing translations.
Summary: What’s New in this Release
- Accessibility Improvements make Visual Studio more accessible than ever.
- Azure Function Tools are included in the Azure development workload. You can develop Azure Function applications locally and publish directly to Azure.
- You can now build applications in Visual Studio 2017 that run on Azure Stack and government clouds, like Azure in China.
- We improved .NET Core development support for .NET Core 2.0, and Windows Nano Server containers.
- In Visual Studio IDE, we improved Sign In and Identity, the start page, Lightweight Solution Load, and setup CLI. We also improved refactoring, code generation and Quick Actions.
- The Visual Studio Editor has better accessibility due to the new ‘Blue (Extra Contrast)’ theme and improved screen reader support.
- We improved the Debugger and diagnostics experience. This includes Point and Click to Set Next Statement. We’ve also refreshed all nested values in variable window, and made Open Folder debugging improvements.
- Xamarin has a new standalone editor for editing app entitlements.
- The Open Folder and CMake Tooling experience is updated. You can now use CMake 3.8.
- We made improvements to the IntelliSense engine, and to the project and the code wizards for C++ Language Services.
- Visual C++ Toolset supports command-prompt initialization targeting.
- We added the ability to use C# 7.1 Language features.
- You can install TypeScript versions independent of Visual Studio updates.
- We added support for Node 8 debugging.
- NuGet has added support for new TFMs (netcoreapp2.0, netstandard2.0, Tizen), Semantic Versioning 2.0.0, and MSBuild integration of NuGet warnings and errors.
- Visual Studio now offers .NET Framework 4.7 development tools to supported platforms with 4.7 runtime included.
- We added clusters of related events to the search query results in the Application Insights Search tool.
- We improved syntax support for SQL Server 2016 in Redgate SQL Search.
- We enabled support for Microsoft Graph APIs in Connected Services.
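As an aside on the C# 7.1 item above, the new language features are small but handy quality-of-life additions. A minimal sketch of three of them (async Main, the default literal, and inferred tuple element names):

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;

class Program
{
    // C# 7.1: async Main is now a valid entry point
    static async Task Main(string[] args)
    {
        // C# 7.1: the "default" literal infers its type from context
        int timeout = default;               // 0
        CancellationToken? token = default;  // null
        Console.WriteLine($"timeout={timeout}, token set={token.HasValue}");

        // C# 7.1: tuple element names are inferred from the variables
        var count = 5;
        var label = "items";
        var pair = (count, label);           // elements named count / label
        Console.WriteLine($"{pair.count} {pair.label}");

        await Task.Delay(1);
    }
}
```

Note that to use these you need to set the language version to 7.1 (or "latest") in the project settings, as Visual Studio defaults to the latest major version.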
I've been busy the past couple of weeks, but if, like me, you are catching up: on 14th Aug, Microsoft released .NET Core 2.0, including ASP.NET Core 2.0.
.NET Core 2.0
.NET and C# – Get Started in 10 Minutes
ASP.NET Core 2.0
This release features compatibility with .NET Core 2.0, tooling support in Visual Studio 2017 version 15.3, and the new Razor Pages user-interface design paradigm. For a full list of updates, you can read the release notes, and for changes from previous versions of ASP.NET Core, check the list of changed items in the ASP.NET Announcements GitHub repository. The latest SDK and tools can be downloaded from https://dot.net/core.
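To give a flavour of the 2.0 release in code, here is a minimal ASP.NET Core 2.0 host. The WebHost.CreateDefaultBuilder helper is new in 2.0 and replaces a chunk of bootstrap boilerplate; the Startup class shown here is just the usual convention (a sketch, not a full project):

```csharp
using Microsoft.AspNetCore;
using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.Hosting;
using Microsoft.AspNetCore.Http;

public class Program
{
    public static void Main(string[] args) =>
        // New in 2.0: CreateDefaultBuilder wires up Kestrel, IIS
        // integration, configuration and logging with a single call.
        WebHost.CreateDefaultBuilder(args)
               .UseStartup<Startup>()
               .Build()
               .Run();
}

public class Startup
{
    public void Configure(IApplicationBuilder app) =>
        app.Run(context => context.Response.WriteAsync("Hello, ASP.NET Core 2.0!"));
}
```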
Cosmos DB Change Feed Processor NuGet package now available
Many database systems have features allowing change data capture or mirroring, for use with live backups, reporting, data warehousing and real-time analytics for transactional systems… Azure Cosmos DB has such a feature, called the Change Feed API, which was first introduced in May 2017.
The Change Feed API provides a list of new and updated documents in a partition in the order in which the updates were made.
Microsoft has recently introduced the new Change Feed Processor Library, which abstracts the existing Change Feed API to facilitate the distribution of change feed event processing across multiple consumers.
The Change Feed Processor library provides a thread-safe, multiple-process, runtime environment with checkpoint and partition lease management for change feed operations.
The Change Feed Processor Library is available as a NuGet package for .NET development. The library makes tasks like these easier: reading changes from a change feed across multiple partitions, and performing computational actions triggered by the change feed in parallel (aka complex event processing).
Judy Shen from the Microsoft Cosmos DB team has published some sample code on GitHub, demonstrating its use.
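To give a feel for the programming model, here is a rough sketch of how the library is wired up, following the shape of the published samples: you implement an observer that receives batches of changed documents, and a host distributes partitions across observer instances using leases. The account URI, key, and collection names below are placeholders, and exact type names and signatures may differ between versions of the (preview) library:

```csharp
using System;
using System.Collections.Generic;
using System.Threading.Tasks;
using DocumentDB.ChangeFeedProcessor;  // NuGet: Microsoft.Azure.DocumentDB.ChangeFeedProcessor
using Microsoft.Azure.Documents;

// Called by the host with batches of changed documents.
public class DocumentObserver : IChangeFeedObserver
{
    public Task OpenAsync(ChangeFeedObserverContext context) => Task.CompletedTask;

    public Task CloseAsync(ChangeFeedObserverContext context, ChangeFeedObserverCloseReason reason)
        => Task.CompletedTask;

    public Task ProcessEventsAsync(IReadOnlyList<Document> docs, ChangeFeedObserverContext context)
    {
        foreach (var doc in docs)
            Console.WriteLine($"Changed document: {doc.Id}");
        return Task.CompletedTask;
    }
}

public static class ChangeFeedHostRunner
{
    public static async Task RunAsync()
    {
        // Placeholder connection details for the monitored collection.
        var monitored = new DocumentCollectionInfo
        {
            Uri = new Uri("https://myaccount.documents.azure.com:443/"),
            MasterKey = "<key>",
            DatabaseName = "db",
            CollectionName = "monitored"
        };
        // A second collection holds the partition leases (checkpoints).
        var leases = new DocumentCollectionInfo
        {
            Uri = new Uri("https://myaccount.documents.azure.com:443/"),
            MasterKey = "<key>",
            DatabaseName = "db",
            CollectionName = "leases"
        };

        // The host coordinates multiple consumers via the lease collection.
        var host = new ChangeFeedEventHost("hostName", monitored, leases);
        await host.RegisterObserverAsync<DocumentObserver>();
    }
}
```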
Working with the change feed support in Azure Cosmos DB
Aravind Ramachandran, Mimi Gentz and Judy Shen also just published an article Working with the change feed support in Azure Cosmos DB on the Azure docs site a few days ago…
Azure Cosmos DB is a fast and flexible globally replicated database service that is used for storing high-volume transactional and operational data with predictable single-digit millisecond latency for reads and writes. This makes it well-suited for IoT, gaming, retail, and operational logging applications. A common design pattern in these applications is to track changes made to Azure Cosmos DB data, and update materialized views, perform real-time analytics, archive data to cold storage, and trigger notifications on certain events based on these changes. The change feed support in Azure Cosmos DB enables you to build efficient and scalable solutions for each of these patterns.
With change feed support, Azure Cosmos DB provides a sorted list of documents within an Azure Cosmos DB collection in the order in which they were modified. This feed can be used to listen for modifications to data within the collection and perform actions such as:
- Trigger a call to an API when a document is inserted or modified
- Perform real-time (stream) processing on updates
- Synchronize data with a cache, search engine, or data warehouse
Changes in Azure Cosmos DB are persisted and can be processed asynchronously, and distributed across one or more consumers for parallel processing. Let’s look at the APIs for change feed and how you can use them to build scalable real-time applications. This article shows how to work with Azure Cosmos DB change feed and the DocumentDB API.
Change feed support is only provided for the DocumentDB API at this time; the Graph API and Table API are not currently supported.
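To make the raw API concrete, the sketch below reads the change feed of a single partition key range with the .NET DocumentDB SDK. The client, collection link, and partition key range ID are assumed to be obtained elsewhere (the SDK exposes ReadPartitionKeyRangeFeedAsync for enumerating ranges):

```csharp
using System;
using System.Threading.Tasks;
using Microsoft.Azure.Documents;
using Microsoft.Azure.Documents.Client;

public static class ChangeFeedReader
{
    // Reads all changes, from the beginning, for one partition key range.
    public static async Task ReadChangesAsync(
        DocumentClient client, string collectionLink, string partitionKeyRangeId)
    {
        var query = client.CreateDocumentChangeFeedQuery(
            collectionLink,
            new ChangeFeedOptions
            {
                PartitionKeyRangeId = partitionKeyRangeId,
                StartFromBeginning = true,  // otherwise only changes from "now"
                MaxItemCount = 100
            });

        while (query.HasMoreResults)
        {
            // Each page is a batch of new/updated documents, in modification order.
            FeedResponse<Document> page = await query.ExecuteNextAsync<Document>();
            foreach (Document doc in page)
                Console.WriteLine($"{doc.Id} changed at {doc.Timestamp}");
            // page.ResponseContinuation can be persisted as a checkpoint
            // and fed back in via ChangeFeedOptions.RequestContinuation.
        }
    }
}
```

This per-partition bookkeeping is exactly what the Change Feed Processor Library abstracts away.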
Use cases and scenarios
Change feed allows for efficient processing of large datasets with a high volume of writes, and offers an alternative to querying entire datasets to identify what has changed. For example, you can perform the following tasks efficiently:
- Update a cache, search index, or a data warehouse with data stored in Azure Cosmos DB.
- Implement application-level data tiering and archival, that is, store “hot data” in Azure Cosmos DB, and age out “cold data” to Azure Blob Storage or Azure Data Lake Store.
- Implement batch analytics on data using Apache Hadoop.
- Implement lambda pipelines on Azure with Azure Cosmos DB. Azure Cosmos DB provides a scalable database solution that can handle both ingestion and query, and implement lambda architectures with low TCO.
- Perform zero down-time migrations to another Azure Cosmos DB account with a different partitioning scheme.
Lambda Pipelines with Azure Cosmos DB for ingestion and query
You can use Azure Cosmos DB to receive and store event data from devices, sensors, infrastructure, and applications, and process these events in real-time with Azure Stream Analytics, Apache Storm, or Apache Spark.
Within web and mobile apps, you can track events such as changes to your customer’s profile, preferences, or location to trigger certain actions like sending push notifications to their devices using Azure Functions or App Services. If you’re using Azure Cosmos DB to build a game, you can, for example, use change feed to implement real-time leaderboards based on scores from completed games.