
Using D3 Visualization for Data-Driven Interactive Motion Graphics

In the midst of the 2012 U.S. presidential election, The New York Times published a series of online articles that contained beautiful, interactive, data-driven graphics to illustrate changes in voter behavior over time and the candidates’ paths to winning the election. Created using a JavaScript library called D3 (for Data-Driven Documents), these data visualizations caused a lot of excitement among developers.

Until that point, these kinds of fast, interactive motion graphics based on large data sets hadn’t really been seen by the general public on websites. Developers recognized that with D3 they could easily create beautiful, data-driven graphics, in this case in collaboration with data scientists, journalists, and graphic designers.

But D3 was more than just another JavaScript library. It was the final link in a series of technological advancements that led to these kinds of graphic possibilities:

1. Large amounts of data became ubiquitous, and the speed at which it could be transmitted and processed increased exponentially.

Since the mid-2000s, there has been a surge in the amount of data available to developers, the speed at which it can be processed, and the ways in which we conceptualize how to display it. At the heart of this explosion are advances in storage technology that have made it incredibly easy, cheap, and fast to store massive amounts of information. For example, in 2007 a 1GB hard drive cost about $60. Ten years later, the same amount of money can get you 1TB of storage. That’s 1,000 times more data for the same cost. Computer and internet connection speeds have dramatically increased as well.

As a result, the fields of artificial intelligence, big data, and data science have been able to mature in ways that were previously not possible. All of this means that the manner in which we think about, analyze, and visually present large data sets has drastically changed in a relatively short amount of time. In addition, people on their home computers now have the ability to quickly download large data sets — processors are 60 times faster than 10 years ago — and perform calculations to display them in interesting ways.

2. All major web browsers began to support Scalable Vector Graphics (SVG), a technology that draws images from mathematical equations.

Scalable Vector Graphics are the second piece of the puzzle because they allow for the creation of images from mathematical equations and code. A normal image, like a .jpg or a .gif, is made up of a series of colored dots, or pixels, like this:

Pixels

These are great for creating photos, but it’s almost impossible to link an image like this to data, because there is no underlying code that generates the image. Contrast that with the SVG for a circle and its code:

SVG code and circle

Note the cx="50" cy="50" r="40" code. This is what defines the circle’s position and size. Since it is code, data can be used to change these values. Once all the major browsers allowed SVG code to be embedded within web pages, developers could use JavaScript to manipulate the images. The difficulty, though, was in converting data values to SVG code, and that is precisely the role D3 plays.
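
To make that concrete, here is a minimal sketch with no D3 involved: the markup is a standard SVG circle like the one above, while the selector and data value are hypothetical.

```javascript
// A standard SVG circle like the one above, embedded in a web page:
//   <svg width="100" height="100">
//     <circle cx="50" cy="50" r="40" fill="steelblue"></circle>
//   </svg>

// Plain JavaScript can rewrite the circle's attributes from a data value.
const circle = document.querySelector("circle"); // grab the SVG circle
const sales = 1600;                              // hypothetical data value
circle.setAttribute("r", Math.sqrt(sales) / 2);  // radius derived from the data
```

Doing this by hand for every shape and every data point quickly becomes tedious, which is exactly the gap D3 fills.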

3. The D3 JavaScript library was created to tie together data and SVG.

At its core, D3 maps data values to visual values, and as simple as that sounds, it is incredibly powerful. Using D3, a developer no longer needs to worry about the math involved in converting, for example, a number of votes into the height of a rectangle in a bar graph. They can simply hand the data to D3, let it figure out the minimum and maximum values, tell it what the minimum and maximum heights of the bars should be, and have it generate all the bars in the graph. D3 takes a lot of work involving tricky math and converts it into a few simple steps.
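
For example, here is a minimal sketch of that idea using D3’s linear scale (assuming D3 v4 or later is loaded on the page; the vote counts are made up):

```javascript
// Hypothetical data: vote totals for four candidates.
const votes = [1200, 3400, 2100, 4800];

// Map the data domain (0 to the max vote count) to a visual range
// (bar heights in pixels). D3 handles the conversion math.
const barHeight = d3.scaleLinear()
  .domain([0, d3.max(votes)]) // data values
  .range([0, 300]);           // pixel values

barHeight(4800); // 300, the tallest bar
barHeight(2400); // 150, half the votes means half the height
```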

D3 has much more advanced functionality, too. It can generate common elements of graphs, such as axes and pie-chart segments. It can animate between different visual values, so that, for example, complex line graphs smoothly morph as the data changes. People are just beginning to scratch the surface of D3’s capabilities, and it’s an exciting time to get involved.
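
To give a flavor of that animation API, here is a hedged sketch that reuses the barHeight scale from the previous example (newVotes is a hypothetical updated data set; assumes D3 v4+):

```javascript
// Animate every bar to its new height over 750 milliseconds
// whenever the underlying data changes.
d3.selectAll("rect")
  .data(newVotes)                       // bind the updated data to the bars
  .transition()                         // interpolate between old and new values
  .duration(750)
  .attr("height", (d) => barHeight(d)); // bars smoothly morph to the new data
```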

D3.js at General Assembly

D3 is an incredibly easy library to use. Because of this, in General Assembly’s full-time Web Development Immersive (WDI) program, on campus and remotely, we often reserve it as an optional topic at the end of the course. WDI focuses on the fundamentals of programming, from front-end essentials like JavaScript through back-end skills like Ruby on Rails and APIs. Once students have a thorough understanding of these competencies, learning D3 comes easily.

GA also offers occasional short-form workshops on D3 and other data visualization techniques, so developers and data scientists can begin to discover how to leverage programming to create data-driven stories.

Meet Our Expert

Matt Huntington has worked as a developer for over 15 years and has a full understanding of all aspects of development (server side, client side, and mobile). Matt graduated magna cum laude from Vassar College with a degree in computer science. He teaches the full-time Web Development Immersive Remote course at General Assembly, and has worked for clients including Nike, IBM, Pfizer, MTV, Chanel, Verizon, Goldman Sachs, AARP, and BAM.

Matt Huntington, Web Development Immersive Remote Instructor

Designing a Dashboard in Tableau for Business Intelligence

Tableau is a data visualization platform that focuses on business intelligence. It has become very popular in recent years because of its flexibility and beautiful visualizations. Clients love the way Tableau presents data and how easy it makes performing analyses. It is one of my favorite analytical tools to work with.

A simple way to define a Tableau dashboard is as an at-a-glance view of a company’s key performance indicators, or KPIs. There are different kinds of dashboards available; it all depends on the business questions being asked and the end user. Is this for an operational team (like one at a distribution center) that needs to see the number of orders per hour and whether sales goals are being met? Or is this for a CEO who would like to measure the productivity of different departments and products against forecast? The first case will require the data to be updated every 10 minutes, almost in real time. The second doesn’t require the same cadence; once a day is enough to track company performance.

Over the past few years, I’ve built many dashboards for different types of users, including department heads, business analysts, and directors, and helped many mid-level managers with data analysis. Here are some best practices for creating Tableau dashboards I’ve learned throughout my career.

First Things First: Why Use a Data Visualization?

Visualizations are among the most effective ways to analyze data from any business process (sales, returns, purchase orders, warehouse operation, customer shopping behavior, etc.).

Below we have a grid report and bar chart that contain the same information. Which is easier to interpret?

Grid report

Bar Chart
Grid report vs. bar chart.

That’s right — it’s quicker to identify the category with the lowest sales, Tops, using the chart.

Many companies used to use grid reports to operate and make decisions, and many departments still do today, especially in retail. I once went to a trading meeting on a Monday morning where team members printed pages of Excel reports with rows and rows of sales and stock data by product and took them to a meeting room with a ruler and a highlighter to analyze sales trends. Some of these reports took at least two hours to prepare and required combining data from different data sources with VLOOKUPs — a function that allows users to search through columns in Excel. After the meeting, they threw the papers away (what a waste of paper and ink!) and then the following Monday it all started again.

Wouldn’t it be better to have a reporting tool in which the company’s KPIs were updated on a daily basis and presented in an interactive dashboard that could be viewed on tablets or laptops and digitally sliced and diced? That’s where a tool like Tableau comes in. You can drill down into details and answer questions raised in the meeting in real time, something you couldn’t do with paper copies.

How to Design a Dashboard in Tableau

Step 1: Identify who will use the dashboard and with what frequency.

Tableau dashboards can be used for many different purposes and therefore will be designed differently for each circumstance. This means that, before you can begin designing a dashboard, you need to know who is going to use it and how often.

Step 2: Define your topic.

The stakeholder (i.e., director, sales manager, CEO, business analyst, buyer) should be able to tell you what kind of business questions need to be answered and the decisions that will be made based on the dashboard.

Here, I am going to use data from a fictional retail company to report on monthly sales.

The commercial director would like to know 1) the countries to which the company’s products have been shipped, 2) which categories are performing well, and 3) sales by product. The option of browsing products is a plus, so the dashboard should include as much detail as possible.

Step 3: Make sure you have all of the necessary data available to answer the questions specified.

Clarify how often you will get the data, the format in which you will receive it (inside a database or in loose files), how clean it is, and whether there are any data quality issues. You need to evaluate all of this before you promise a delivery date.

Step 4: Create your dashboard.

When it comes to dashboard design, it’s best practice to present data from top to bottom. The story should go from left to right, like a comic book, where you start at the top left and finish at the bottom right.

Let’s start by adding the data set to Tableau. For this demo, the data is contained in an Excel file generated by software I developed myself. It’s all dummy data.

To connect to an Excel file from Tableau, select “Excel” from the Connect menu. The tables are on separate Excel sheets, so we’re going to use Tableau to join them, as shown in the image below. Once the tables are joined, go to the bottom and select Sheet 1 to create your first visualization.

Excel Sheet in Tableau
Joining Excel sheet in Tableau.

We have two columns in the Order Details table: Quantity and Unit Price. The sales amount is Quantity x Unit Price, so we’re going to create a new metric, “Sales Amount”, defined with the calculated-field formula [Quantity] * [Unit Price]. To create it, right-click on the measures and select Create > Calculated Field.

Creating a Map in Tableau

We can use maps to visualize data with a geographical component and compare values across geographical regions. To answer our first question, “To which countries have the company’s products been shipped?”, we’ll create a map view of sales by country.

1. Add Ship Country to the rows and Sales Amount to the columns.

2. Change the view to a map.

Map
Visualizing data across geographical regions.

3. Add Sales Amount to Color on the Marks card. Darker colors mean higher sales amounts aggregated by country.

4. You can choose to make the size of the bubbles proportional to the Sales Amount. To do this, drag the Sales Amount measure to the Size area.

5. Finally, rename the sheet “Sales by Country”.

Creating a Bar Chart in Tableau

Now, let’s visualize the second request: “Which categories are performing well?” We’ll need to create a second sheet. The best way to analyze this data is with bar charts, as they are ideal for comparing data across categories. Pie charts work in a similar way, but in this case we have too many categories (more than four), so a pie chart wouldn’t be effective.

1. To create a bar chart, add Category Name to the rows and Sales Amount to the columns.

2. Change the visualization to a bar chart.

3. Swap the columns and rows, sort the bars in descending order, and show the values so users can see the exact value each bar represents.

4. Drag the category name to “Color”.

5. Now, rename the sheet to “Sales by Category”.

Sales category bar chart
Our Sales by Category breakdown.

Assembling a Dashboard in Tableau

Finally, the commercial director would like to see the details of the products sold by each category.

Our last page will be the product detail page. Add Product Name and Image to the rows and Sales Amount to the columns. Rename the sheet as “Products”.

We are now ready to create our first dashboard! Arrange the charts on the dashboard so that it looks similar to the example below. To display the images, drag the Web Page object next to the Products grid.

Dashboard Assembly
Assembling our dashboard.

Additional Actions in Tableau

Now, we’re going to add some actions on the dashboard such that, when we click on a country, we’ll see both the categories of products and a list of individual products sold.

1. Go to Dashboard > Actions.

2. Add Action > Filter.

3. Our “Sales by Country” chart is going to filter Sales by Category and Products.

4. Add a second action. Sales by Category will filter Products.

5. Add a third action, this time selecting URL.

6. Select Products as the source sheet, choose <Image> as the URL, and click Test Link to verify the image’s URL.

What we have now is an interactive dashboard with a worldwide sales view. To analyze a specific country, we click on the corresponding bubble on the map and Sales by Category will be filtered to what was sold in that country.

When we select a category, we can see the list of products sold in that category. And when we hover over a product, we can see an image of it.

In just a few steps, we have created a simple dashboard from which any head of department would benefit.

Dashboard
The final product.

Dashboards in Tableau at General Assembly

In GA’s Data Analytics course, students get hands-on training with the versatile Tableau platform. Create dashboards to solve real-world problems in accelerated 1-week or part-time 10-week course formats, on campus and online. You can also get a taste in our interactive classes and workshops.

Meet Our Expert

Samanta Dal Pont is a business intelligence and data analytics expert in retail, eCommerce, and online media. With an educational background in software engineering and statistics, her great passion is transforming businesses to make the most of their data. Responsible for analytics, reporting, and visualization in a global organization, Samanta has been an instructor for Data Analytics courses and SQL bootcamps at General Assembly London since 2016.

Samanta Dal Pont, Data Analytics Instructor, General Assembly London

Data Storytelling: 3 Objectives to Accomplish With Visualizations

Storytelling is as old as humanity itself. The words, the cadence, the visuals — whether seen with our eyes or in our minds — quickly capture us, engaging every area of our brains as we listen intently, anticipate logically, and become entangled emotionally. From the podcasts we listen to during our morning commute, to office gossip, to Thursday night Must See TV, we are awash in stories. And we love them so much that we heap money and accolades upon our best storytellers, from singers, authors, and movie directors to the social media stars sharing the visual story of their lives. So, it should be no surprise that we desire a good story with our data as well.

What Is Data Storytelling?

Data is a snapshot of measurable information that details what has happened at some point in time. Examples of data in a business context may include measurable events, such as the amount of sales achieved, the number of social media impressions captured, or the duration of rental bike rides during weekdays and weekends. Data storytelling allows you, the business professional, to explain why these events have occurred, what may happen next, and what business decisions can be made with this newly acquired knowledge. Effective data storytelling, in the form of presentations that combine context, details, and visual illustrations, accomplishes three main objectives:

  • Builds a credible narrative around an analysis of data.
  • Connects a series of insights with a smooth flow of information.
  • Concludes the narrative with a compelling call to action.

Data Storytelling in Business: Inside Luxury Retail

Being skilled at data storytelling is critical in every business, and I’ve seen it hard at work specifically in the world of luxury retail, the industry I’ve been lucky enough to analyze and explore. Data is created with every physical interaction in a store and every click on a screen, and decisions must keep pace with this constant stream of information. On top of that, almost every retail initiative requires collaboration between many people and teams, including finance, merchandising, marketing, stores, visual, and logistics, to name a few. By presenting a clear flow of analysis that all teams and stakeholders can follow with an actionable conclusion, I’m able to motivate my team and drive results.

By completing an analysis of style selling by price point, I may discover that my customers buy more handbags that cost $500 than handbags that cost $1,000. This might be the opposite of what my buyers and sales associates expected. Based on this price-sensitivity insight, my new strategy might be to increase the number of handbags priced below $500 and decrease the number of handbags priced above $1,000. To get my team to support this change, I frame my data story around what it means for them: more impactful visual displays for the buyers and more satisfying store experiences for the sales associates. By presenting the data with my audience in mind, I can communicate effectively with all team members, especially those who may not be motivated by numbers or percentages but instead understand a story about meeting customer needs and delivering high-quality service.

Data storytelling is utilized across many industries and topics, borrowing from many story arcs that we already know and love. In the TED Talk “The Math Behind Basketball’s Wildest Moves,” Rajiv Maheswaran employs the excitement and pace of a basketball game highlight reel to tell the data story of movement in our everyday lives. He grabs us from the start with high-tech visuals that turn pro-basketball plays into “moving” data “dots.” From these data points, he decodes the patterns found in game-winning plays with the excitement of a sportscaster. By the end of the 12-minute presentation, he has not only convinced us of the importance of tracking movement to create more basketball wins, but that our movements, as regular people at work, at home, and beyond, can generate insights that create more wins for us in the game of life.

Data Storytelling at General Assembly

In General Assembly’s data-focused courses, students practice converting analysis results into compelling stories that drive business solutions using real-world data sets. In our part-time Data Analytics course, for example, students analyze open data from organizations like Mozilla and Airbnb and use one of several storyboard frameworks to guide the arc of their data story. Students can also dive into the essentials of data storytelling in a self-paced, mentor-guided Data Analysis course, as well as part- and full-time Data Science programs.

GA’s project-based approach provides three key benefits:

  • Each presentation gives students an opportunity to test different storytelling frameworks, helping them learn what works best for different data situations, as well as what fits their personal style.
  • Hearing other students present from the same data set allows students to see how different approaches lead to different insights and different levels of effectiveness.
  • Receiving immediate presentation feedback from their instructor, instructional associate, and peers allows students to greatly improve their presentation skills within a short amount of time.

Meet Our Expert

Alissa Livingston is a financial planning manager at Saks Fifth Avenue. When she’s not traveling to Paris or Milan to negotiate with the luxury brands we all covet, she’s training for half marathons, rain or shine. Alissa received a bachelor’s degree in mechanical engineering from Northwestern University and an MBA from Columbia Business School in New York City. She’s an instructor for GA’s part-time Data Analytics course and weekend SQL Bootcamp.

Alissa Livingston, Data Analytics Instructor, General Assembly New York

Databases: The Fundamentals of Organizing and Linking Data

All around the globe, people are constantly tweeting, Googling, booking airline tickets, and banking online, among hundreds of other everyday internet activities. Each of these actions creates pieces of data — and all of these have to live somewhere. That’s where databases — put simply, a collection of data — come in.

Let’s look at LinkedIn as an example of how databases are used. When you first sign up for an account, you create a username and password. These are typically stored in some sort of database — usually one that’s encrypted in order to protect users’ privacy.

Once you’ve created an account, you can start updating your profile, sharing links to articles, and commenting on connections’ posts.

Here’s what happens when you interact with LinkedIn.

These links and comments eventually end up in a database. The main idea is that anyone with the proper permissions can then manage (search, see, comment, share, or like) these elements. All of these actions are usually performed by a piece of software that manages the database.

How Does a Database Work?

Let’s start by focusing on the first part of the word “database”: data. “Data” refers to an unstructured collection of known information.

For example, take a LinkedIn user named Joe whose email address is joe@someemail.com. Right now, we know two things about him: his name and his email address. These are two pieces of data.

Next, we need to organize related pieces of data. This is usually done through a structured format, such as a table. A table is composed of columns (also known as fields) and rows (also referred to as records).

Below, we see that our Joe data are now organized in a table called “Person”. Here, we have a record of Joe’s information: His name is in one field, his email is in another field, and we assign Joe a number (in a third field) for easy reference.

Person

person_number | first_name | email
------------- | ---------- | -----------------
1             | Joe        | joe@someemail.com

As you might expect, in any database there can be many tables, one per related data collection. Simplifying our LinkedIn example, we might have a “Person” table, an “Education” table, and a “Comment” table as we collect more data points about a user and their activities.

Now, these tables can (optionally) be linked together to form some sort of relationship between them. For example, Joe may have listed the schools he attended, which could be represented by a relationship between the “Person” and “Education” tables. Thanks to this relationship, we know which schools in the “Education” table are Joe’s.
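
As an illustration only, here is that relationship sketched with plain JavaScript objects standing in for database records (the field names and values are hypothetical):

```javascript
// One record from the "Person" table.
const person = { personNumber: 1, firstName: "Joe", email: "joe@someemail.com" };

// Records from the "Education" table. The personNumber field links
// each education record back to a row in the "Person" table.
const education = [
  { educationId: 10, personNumber: 1, school: "State University" },
  { educationId: 11, personNumber: 2, school: "City College" },
];

// "Which schools are Joe's?" is answered by following that link,
// which is what a relational database does when tables are joined.
const joesSchools = education
  .filter((record) => record.personNumber === person.personNumber)
  .map((record) => record.school);

console.log(joesSchools); // ["State University"]
```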

This step is usually where pieces of structured, related data are translated into information.

Any organization can have multiple databases — one for sales information, one for payroll information, and so on. To maintain these, they often turn to a type of software known as a database management system, or DBMS. There are many types of DBMS to choose from, including Oracle, Microsoft SQL Server, MySQL, and Postgres.

The database itself is housed in a piece of hardware, a physical machine that either resides on a company’s premises or is rented offsite through providers like Amazon Web Services, Google Cloud Platform, or Microsoft Azure.

Last but not least, the data contained in the database needs to be accessible through some sort of admin tool or programming language. Analysts typically use a set of digital tools — including Microsoft Excel, IBM Cognos Analytics, pgAdmin, the R language, and Tableau — to examine this data for patterns and trends.

Data analysts can then use these patterns and trends to make informed decisions.

For example, if you’re a data analyst at a large company, you may be tasked with helping management determine a price for a new product. One approach you could take is looking at how much the product costs to produce — how much of people’s time and effort, as well as machinery, is needed to make and maintain the product. Let’s say you do that by analyzing the data sets of payroll and procurement and come up with a cost of $30. Then you’ll look at how much customers are willing to pay, and perhaps another data set can inform you that similar companies have charged up to $50 for a similar product.

But you can also see that the price might have a seasonal trend, meaning people buy more of this product in, say, December than during the rest of the year. A data analyst could use any of the above-mentioned data tools to visualize these three data sets (production cost, competitors’ prices, and seasonal purchasing trends) and recommend that the best price for the new product is $40.

When and Why Are Databases Used?

There are different types of databases and management solutions for different types of problems. Here are just a few reasons why you may need a database and what solutions you may choose in each situation:

  • Storing, processing, and searching large amounts of information: If you’re working for a company like Facebook that manages half a million comments every minute, a database could be used as a place to source reporting/analytics or run machine learning algorithms. The solution may be some sort of distributed data storage and processing framework, like Apache Hadoop or Spark.
  • Building a mobile app: If you’re creating an app, you’ll want to choose a database that is small and efficient for mobile devices, like SQLite or Couchbase.
  • Working at a startup or medium-sized business: If you’re on a tight budget or want a database that’s widely documented and used, then look to open-source database management systems like MySQL, PostgreSQL, MongoDB, or MariaDB.

Databases at General Assembly

At General Assembly, we empower students with the data tools and techniques needed to use data to answer business questions. In our full- and part-time data courses, students use databases to perform data analysis using real-world data.

In our part-time on-campus Data Analytics course or online Data Analysis program, students learn the fundamentals of data analysis and leverage data tools like Excel, PostgreSQL, pgAdmin, SQL, and Tableau. In our part-time Data Science course, students discover different types of databases, learn how to pull data from them, and more. In the career-changing Data Science Immersive, they gather, store, and organize data with SQL, Git, and UNIX while learning the skills to launch a career in data.

Meet Our Expert

Gus Lopez is a tech lead with more than 20 years of experience delivering IT projects around the world, including for many Fortune 500 companies. He teaches the part-time Data Analytics course at General Assembly’s Melbourne campus. His achievements include consistently delivering technically challenging back-end, front-end, and data science projects, as well as managing multidisciplinary teams. Gus has a master’s degree in computer science from Rice University, an MBA from Melbourne Business School, and a Ph.D. with summa cum laude distinction in data science from Universidad Central de Venezuela. He is passionate about analyzing and searching for insights from data to improve processes and create competitive advantage for organizations.

Gus Lopez, Data Analytics Instructor, General Assembly Melbourne

Excel: Building the Foundation for Understanding Data Analytics

If learning data analytics is like trying to ride a bike, then learning Excel is like having a good set of training wheels. Although some people may want to jump right ahead without them, they’ll end up with fewer bruises and a smoother journey if they begin practicing with them on. Indeed, Excel provides an excellent foundation for understanding data analytics.

What exactly is data analytics? It’s more than just simply “crunching numbers,” for one. Data analytics is the art of analyzing and communicating insights from data in order to influence decision-making.

In the age of increasingly sophisticated analytical tools like Python and R, some seasoned analytics professionals may scoff at Excel, first released by Microsoft in 1987, as nothing more than simple spreadsheet software. But most people only touch the tip of the iceberg when it comes to fully leveraging this ubiquitous program’s power as a stepping stone into analytics.

Using Excel for Data Analysis: Management, Cleaning, Aggregation, and More

I refer to Excel as the gateway into analytics. Once you’ve learned the platform inside and out, throughout your data analytics journey you’ll continually say to yourself, “I used to do this in Excel. How do I do it in X or Y?” In today’s digital age, it may seem like there are new analytical tools and software packages coming out every day. As a result, many roles in data analytics today require an understanding of how to leverage and continuously learn multiple tools and packages across various platforms. Thankfully, learning Excel and its fundamentals will provide a strong bedrock of knowledge that you’ll find yourself frequently referring back to when learning newer, more sophisticated programs.

Excel is a robust tool that provides foundational knowledge for performing tasks such as:

  • Database management. Understanding the architecture of any data set is one of the first steps of the data analytics workflow. In Excel, each worksheet can be thought of as a table in a database. Each row in a worksheet can then be considered a record, while each column can be considered an attribute. As you continue to work with multiple worksheets and tables in Excel, you’ll learn that functions such as “VLOOKUP” and “INDEX/MATCH” are similar to the “JOIN” clauses seen in SQL.
  • Data cleaning. Cleaning data is often one of the most crucial and time-intensive components of the data analytics workflow. Excel can be used to clean a data set using various string functions such as “TRIM”, “MID”, or “SUBSTITUTE”. Many of these functions cut across various programs and will look familiar when you learn similar functions in SQL and Tableau.
  • Data aggregation. Once the data’s been cleaned, you’ll need to summarize and compile it. Excel’s aggregation functions such as “COUNT”, “SUM”, “MIN”, or “MAX” can be used to summarize the data. Furthermore, Excel’s Pivot Tables can be leveraged to aggregate and filter data quickly and efficiently. As you continue to manipulate and aggregate data, you’ll begin to understand the underlying SQL queries behind each Pivot Table.
  • Statistics. Descriptive statistics and inferential statistics can be applied through Excel’s functions and add-ins to better understand our data. Descriptive statistics such as the “AVERAGE”, “MEDIAN”, or “STDEV” functions tell us about the central tendency and variability of our data. Additionally, inferential statistics such as correlation and regression can help identify meaningful patterns in the data, which can be further analyzed to make predictions and forecasts.
  • Dashboarding and visualization. One of the final steps of the data analytics workflow involves telling a story with your data. The combination of Excel’s Pivot Tables, Pivot Charts, and slicers offers the underlying tools and flexibility to construct dynamic dashboards with visualizations to convey your story to your audience. As you build dashboards in Excel, you’ll begin to see that the Pivot Table fields in Excel are the common denominator in almost any visualization software and are no different from the “shelves” used in Tableau to create visualizations.

If you want to jump into Excel but don’t have a data set to work with, why not analyze your own personal data? You could leverage Excel to keep track of your monthly budget and create a dashboard to see what your spending trends look like over time. Or if you have a fitness tracker, you could export the data from the device and create a dashboard to show your progress over time and identify any trends or areas for improvement. The best way to jump into Excel is to use data that’s personal and relevant — so your own health or finances can be a great start.

Excel at General Assembly

In GA’s part-time Data Analytics course and online Data Analysis course, Excel is the starting point for leveraging other analytical tools such as SQL and Tableau. Throughout the course, you’ll continually have “data déjà vu” as you tell yourself, “Oh, this looks familiar.” Students come to understand why Excel is considered a jack-of-all-trades: it provides a great foundation in database management, statistics, and dashboard creation. However, as the saying goes, “A jack-of-all-trades is a master of none.” As such, students will also recognize the limitations of Excel and the point at which tools like SQL and Tableau offer greater functionality.

At GA, we use Excel to clean and analyze data from sources like the U.S. Census and Airbnb to formulate data-driven business decisions. During final capstone projects, students are encouraged to use data from their own line of work to leverage the skills they’ve learned. We partner with students to ensure that they are able to connect the dots along the way and “excel” in their data analytics journey.

Having a foundation in Excel will also benefit students in GA’s full-time Data Science Immersive program as they learn to leverage Python, machine learning, visualizations, and beyond, and those in our part-time Data Science course, who learn skills like statistics, data modeling, and natural language processing. GA also offers day-long Excel bootcamps across our campuses, during which students learn how to simplify complex tasks including math functions, data organization, formatting, and more.

Meet Our Expert

Mathu A. Kumarasamy is a self-proclaimed analytics evangelist and aspiring data scientist. A believer in the saying that “data is the new oil,” Mathu leverages analytics to find, extract, refine, and distribute data in order to help clients make confident, evidence-based decisions. He is especially passionate about leveraging data analytics, technology, and insights from the field of behavioral economics to help establish a culture of evidence-based, value-driven health care in the United States. Mathu enjoys converting others into analytics geeks while teaching General Assembly’s part-time Data Analytics course in Atlanta.

Mathu A. Kumarasamy, Data Analytics Instructor, GA Atlanta

How Big Data Creates the Perfect Digital Marketing Applications

Social media is big data. Every 60 seconds, Facebook users share 2,460,000 pieces of content and Yelp receives 26,380 reviews. Don’t forget about the 2 million blog posts created each day and the 1 billion websites available for us to peruse.

With all of this content floating around the Internet, digital marketers struggle to truly engage and convert an increasingly fragmented online audience. Relying on manual processes to seek out and engage with relevant social media posts is not enough. As a result, there is a growing demand for applications that allow digital marketers to automatically understand the content shared about their brand, pinpoint the users to target, and market to them in a personalized way.

The Skills and Tools Every Data Scientist Must Master

Photo by WOC in Tech.

“Data scientist” is one of today’s hottest jobs.

In fact, Glassdoor calls it the best job of 2017, with a median base salary of $110,000. This fact shouldn’t be big news. In 2011, McKinsey predicted there would be a shortage of 1.5 million managers and analysts “with the know-how to use the analysis of big data to make effective decisions.” Today, there are more than 38,000 data scientist positions listed on Glassdoor.com.

It makes perfect sense that this job is both new and popular, since every move you make online creates data somewhere for something. Someone has to make sense of that data, discover the trends in it, and determine whether it is useful. That is the job of the data scientist. But how does a data scientist go about the job? Here are the three skills and three tools that every data scientist should master.

Announcing General Assembly’s New Data Science Immersive

Data science is one of the hottest and best-paid professions in the U.S. More than ever, companies need analytical minds who can compile data, analyze it, and drive everything from marketing forecasts to product launches with compelling predictions. Their work drives the core strategies of modern business, so much so that, by 2018, data-related job openings will total 1.5 million. That’s why we’ve worked hard to develop classes, workshops, and courses to confront the data science skills gap. The latest addition to our proud family of data education is the new Data Science Immersive program.

Launching for the first time in San Francisco and Washington, D.C. on April 11, this full-time Immersive program will equip you with the tools and techniques you need to become a data pro in just 12 weeks.

5 High-Paying Careers That Require Data Analysis Skills

The term “big data” is everywhere these days, and with good reason. More products than ever before are connected to the Internet: phones, music players, DVRs, TVs, watches, video cameras…you name it. Almost every new electronic device created today is connected to the Internet in some way for some purpose.

The result of all those things connected to the Internet is data. Big, big data. What does that mean for you? Simply put, if you can quickly, accurately, and intelligently sift through data and find trends, you are extremely valuable in today’s tech job market. More specifically, here are five job titles that require data analytics expertise to get ahead.

How to Get a Job In Data: A Livestream with Fast Company

We teamed up with Fast Company to host two of the leading minds in data, Claudia Perlich from Dstillery and Marc Maleh from R/GA, at our campus in New York City. Sarah Lawson, an assistant editor at Fast Company, moderated the discussion as they chatted about their everyday work with data, their favorite parts of the industry, and what it’s really like to work in data.
