How to Work With Very Large Data Sets Using Excel - TurboFuture - Technology

Kevin is a data scientist and data engineer. He works for a large consulting company in Montreal, Canada, and has over 20 years of experience.


Explore techniques for handling very large datasets in Excel from Azure. Use any of Microsoft Azure's storage options, such as HDInsight (Hive), Cosmos DB, Azure Storage, or MongoDB on Azure, with Azure Analysis Services from Excel and Power Query.

Introduction

Let’s face it: at some point your datasets will outgrow Excel’s roughly one-million-row worksheet capacity, and perhaps even the roughly 100-million-row capacity of Power Query (Power Pivot or Get & Transform). Your computer has only so much storage and processing capacity.

Even working with 100 million rows in Power Query will drag on your local resources, depending on what your computer has available.
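To make the row limit concrete: a modern Excel worksheet tops out at 1,048,576 rows, so anything bigger must be processed in pieces or pushed to a server. Below is a minimal, hedged Python sketch (the file, column name, and chunk size are all illustrative) that streams a CSV-style source in fixed-size chunks so memory use stays flat no matter how many rows arrive:

```python
import csv
import io

EXCEL_ROW_LIMIT = 1_048_576  # rows per worksheet in modern Excel

def stream_total(lines, chunk_size=100_000):
    """Aggregate an 'amount' column in fixed-size chunks so memory
    use stays flat no matter how many rows the source contains."""
    reader = csv.DictReader(lines)
    total, rows = 0.0, 0
    chunk = []
    for row in reader:
        chunk.append(float(row["amount"]))
        if len(chunk) == chunk_size:
            total += sum(chunk)
            rows += len(chunk)
            chunk.clear()
    total += sum(chunk)  # flush the final partial chunk
    rows += len(chunk)
    return total, rows

# Simulate a source with more rows than one Excel sheet can hold.
data = io.StringIO("amount\n" + "1\n" * (EXCEL_ROW_LIMIT + 10))
total, rows = stream_total(data)
print(rows > EXCEL_ROW_LIMIT, total)
```

This is essentially what Power Query's streaming engine does for you behind the scenes; past a certain scale, though, even streaming locally becomes impractical, which is where the Azure options below come in.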

Microsoft offers several storage options for very large (Big Data) datasets that can work with Excel. You only need Azure Analysis Services (the cloud counterpart of SQL Server Analysis Services). Mind you, if your SQL Server database is installed on-premises, in a cloud virtual machine, or in a private virtual cloud, you can also use SQL Server Analysis Services to handle very large datasets.

Architecture

Data Storage

You will need Excel 2010 or later with Power Pivot (2010/2013) or Get & Transform (2016 and later). The two technologies are essentially the same apart from branding. In the 2010/2013 time frame, Power Pivot was a standalone add-in, while in Excel 2016 (certain Office 365 versions) and on-premises Excel 2016, Get & Transform is built in.

You will also need a storage option in Azure. There are several options to choose from:

Azure Databricks (a commercial platform built on Apache Spark, with support for Apache Hive, MLlib and Apache Arrow)

Azure HDInsight (which is Microsoft’s version of Hadoop, based on Hortonworks Hadoop)

Azure Storage, a massive file system that can handle very large flat files, for example, as well as blobs, which are massive object data stores

Azure Cosmos DB, a distributed NoSQL database

SQL Server (on-premises, in a private cloud, in an Azure VM, or as Azure SQL Database), which are all the same SQL Server technology hosted differently

MongoDB on Azure, a NoSQL massive data store that is natively hosted through Azure, can be distributed globally, and can handle massive amounts of data

All these data solutions can easily be set up through the Azure portal at portal.azure.com.
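Each of these stores exposes a different kind of endpoint, which is what you will eventually point Excel (or Analysis Services) at. The sketch below only illustrates the general endpoint formats; the account name "mydata" is hypothetical, and real connections also require keys or credentials that are omitted here:

```python
# Illustrative Azure endpoint formats only -- "mydata" is a
# hypothetical account/server name; credentials are omitted.
account = "mydata"

endpoints = {
    # Azure Cosmos DB exposes an HTTPS endpoint per account.
    "cosmosdb": f"https://{account}.documents.azure.com:443/",
    # Azure SQL Database servers live under database.windows.net.
    "azure_sql": f"tcp:{account}.database.windows.net,1433",
    # Cosmos DB's MongoDB API uses the MongoDB URI scheme.
    "mongodb": f"mongodb://{account}.mongo.cosmos.azure.com:10255/",
    # Azure Blob Storage endpoint for a storage account.
    "blob": f"https://{account}.blob.core.windows.net/",
}

for name, url in endpoints.items():
    print(name, url)
```

Whichever store you pick, the exact endpoint (and keys) can be copied from that resource's blade in the Azure portal.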


Middleware

The data storage solutions above are designed to scale massively and globally, so you cannot simply download their contents and manipulate the data locally; you need a middleware interface to bridge the communication between these massive data stores and Excel. This can be done with Azure Analysis Services, or through a metadata connection directly to the source. A metadata connection lets you manipulate the data from Excel while the data itself stays on the back-end data servers.
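The key idea is that the heavy computation runs next to the data and only a small result travels back to Excel. The toy simulation below makes that shape visible; the "server" here is just an in-memory list, and all names are invented for illustration:

```python
from collections import defaultdict

# Toy stand-in for a remote store: two million (region, amount) facts.
SERVER_ROWS = ([("east", i % 7) for i in range(1_000_000)]
               + [("west", i % 5) for i in range(1_000_000)])

def server_side_summary(rows):
    """Runs 'on the server': scans every row but returns only a tiny
    per-region summary -- the shape of what an Analysis Services
    connection sends back to the client."""
    totals = defaultdict(int)
    counts = defaultdict(int)
    for region, amount in rows:
        totals[region] += amount
        counts[region] += 1
    return {r: {"sum": totals[r], "rows": counts[r]} for r in totals}

summary = server_side_summary(SERVER_ROWS)
print(summary)  # 2 summary rows returned instead of 2,000,000 raw rows
```

Two million rows are scanned remotely, but only two summary rows cross the wire; that asymmetry is what makes Big Data workable from a desktop copy of Excel.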

Azure Analysis Services is the equivalent of SQL Server Analysis Services. The former serves databases in Azure; the latter serves SQL Server data warehouses on-premises, on an Azure-hosted VM, or even on a VM in AWS, for instance.

Azure Analysis Services allows you to create a Tabular data model, the same modeling technology used by Power Pivot in Excel and by Power BI. Once you connect to AAS (Azure Analysis Services), Excel creates a connection and stores the model's metadata.

To create a data model, you will need to install Visual Studio 2019 (Fig 4); there is a free Community edition. During installation, select the "Data storage and processing" workload and, in particular, the SQL Server Data Tools.

Fig 4

Once installed, launch Visual Studio and create an Analysis Services project from the list of project templates.

Fig 5

Once your project is created, you will need to connect to your data source and select the tables and/or views you need to build your model.

Fig 6

Fig 7

Fig 8

Excel

In Excel, you have several options: connect directly to the source and build your model in Excel, as with Power Pivot, or connect to Analysis Services as mentioned before. Either way, select the data source by clicking "Get Data" in the "Get & Transform Data" group on the Data tab.

Fig 9

From the Get Data menu, select "From Azure", then choose the connection option that matches the type of Big Data storage you chose for the back end of your architecture.

Fig 10

To use Azure Analysis Services, open the "Get Data" menu, select "From Database", and then choose "From Analysis Services". This creates a connection and transfers the metadata to Excel, but the data remains on the server.
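Conceptually, such a live connection caches only the model's metadata (table and field names) on the client, and every value you see is fetched on demand. This toy class, with names invented purely for illustration, sketches that behavior:

```python
class LiveConnection:
    """Toy model of a 'metadata only' connection: the client keeps
    field names locally; every value request is evaluated against
    the remote rows."""

    def __init__(self, server_rows, fields):
        self._server_rows = server_rows   # stays "on the server"
        self.fields = fields              # the only thing cached locally

    def query(self, field, where=None):
        # Re-evaluated against the remote rows on every call.
        idx = self.fields.index(field)
        rows = (r for r in self._server_rows
                if where is None or where(r))
        return sum(r[idx] for r in rows)

conn = LiveConnection(
    server_rows=[(2020, 150), (2020, 75), (2021, 300)],
    fields=["year", "sales"],
)
print(conn.fields)          # ['year', 'sales'] -- local metadata only
print(conn.query("sales"))  # 525 -- computed remotely, scalar returned
print(conn.query("sales", where=lambda r: r[0] == 2020))  # 225
```

This mirrors what happens when you slice a PivotTable built on an Analysis Services connection: each slice triggers a server-side query, and only the resulting cells reach the workbook.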

These are very powerful data analysis and data engineering/science options.

Fig 12

Conclusion

Wrangling Big Data, or very large datasets, in Excel is entirely possible using Azure's various massive storage options. You can use Azure Analysis Services to design and build your data model and connect it to Excel, or you can create a direct connection to massive data stores such as Hive, Spark, MongoDB, Cosmos DB, and SQL Server, to name a few that are available with a subscription to the Microsoft Azure public cloud.

This article is accurate and true to the best of the author’s knowledge. Content is for informational or entertainment purposes only and does not substitute for personal counsel or professional advice in business, financial, legal, or technical matters.

© 2020 Kevin Languedoc

Comments

Kevin Languedoc (author) from Canada on September 16, 2020:

Hi Dale

I'm glad I am able to provide useful help through my articles.

Kevin

Dale Anderson from The High Seas on September 15, 2020:

Another great article to help folk like me. Keep them coming please!

Kevin Languedoc (author) from Canada on April 08, 2020:

Thanks

Umesh Chandra Bhatt from Kharghar, Navi Mumbai, India on April 07, 2020:

Excellent article. Useful for people handling large data sets.
