Five Ways to Unleash the Power of a Marketing Bot

From advertising and promotion to selling and feedback, marketing chatbots are reducing costs for companies, saving time, increasing efficiency, and building greater awareness for brands.

Here are five reasons why chatbots are a perfect companion for marketing:

1. Personalized Advertising

Chatbots remove one of advertising's biggest hurdles: unwanted promotional emails.

In a 2017 survey conducted by Adobe Digital Insights, a little more than two-thirds of respondents preferred personalized ads. Personalized chatbots can tailor offerings to customers' needs with a high degree of accuracy.

Established brands like eBay and 1-800-Flowers were early adopters of personalized marketing systems, and with IBM Watson, more and more companies are adopting similar strategies.

Rocket Fuel, a predictive marketing software company, relies on Watson, which allows brands to gain a competitive advantage through better performance, improved brand safety, and deeper intelligence. The collaboration uses natural language processing (NLP) techniques to analyze the sentiment of stories collected from multiple sources. It then breaks down the sentiment of breaking news headlines and responds accordingly, reducing the risk of negative news articles appearing on the same page as the brand.
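To make the idea concrete, here is a minimal sketch of sentiment-gated ad placement. It is illustrative only, not Rocket Fuel's or Watson's actual implementation; the word lists, scoring, and threshold are hypothetical stand-ins for a trained NLP model.

```python
# Illustrative only: a toy sentiment gate for ad placement, not Rocket Fuel's
# or Watson's actual implementation. The word lists, scoring, and threshold
# stand in for a trained NLP model.
NEGATIVE_WORDS = {"crash", "scandal", "lawsuit", "recall", "outbreak", "fraud"}
POSITIVE_WORDS = {"record", "award", "growth", "breakthrough", "win"}

def headline_sentiment(headline):
    """Crude lexicon score in [-1, 1]; real systems use trained NLP models."""
    words = headline.lower().split()
    raw = sum(w in POSITIVE_WORDS for w in words) - sum(w in NEGATIVE_WORDS for w in words)
    return max(-1.0, min(1.0, 5 * raw / max(len(words), 1)))

def safe_to_place_ad(headline, threshold=-0.1):
    """Skip placements next to stories whose sentiment falls below the threshold."""
    return headline_sentiment(headline) >= threshold

for h in ["Automaker issues recall after safety scandal",
          "Local team wins championship in record season"]:
    print(h, "->", "place ad" if safe_to_place_ad(h) else "withhold ad")
```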

The outdoor clothing retailer The North Face employs a chatbot to help people shop for the perfect jacket. Powered by Watson, the shopping assistant engages in a quick Q&A and lets Watson figure out what the customer needs.

Bank of America created a bot, Erica, that can be used across a variety of platforms, such as Facebook Messenger and SMS text messaging. The AI bot helps customers make smarter decisions based on their spending.

2. Prompt Customer Service

A well-strategized marketing chatbot instantly responds to customer requirements (without being intrusive). But there is more to chatbots that makes them perfect for customer service.

For one, they provide quick and frictionless solutions to customers' problems. Live-support chatbots allow personalized interactions in real time. Nanorep is a service chatbot that combines AI with natural language processing (NLP) for customer service and eCommerce. IKEA, Vodafone, and FedEx are a few of the companies that employ Nanorep to deliver simple, effortless self-service to their customers.
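The matching step behind such self-service bots can be sketched very simply. The snippet below is a toy keyword-overlap matcher, not Nanorep's technology; the intents and canned answers are made up for illustration.

```python
# Illustrative sketch of a rule-based support bot: map a message to the
# best-matching intent by keyword overlap. Not Nanorep's implementation;
# the intents and answers are hypothetical.
INTENTS = {
    "order_status": ({"order", "status", "tracking", "shipped"},
                     "Your order status is available under 'My Orders'."),
    "returns":      ({"return", "refund", "exchange"},
                     "You can start a return within 30 days from the returns page."),
    "store_hours":  ({"hours", "open", "close", "opening"},
                     "Most stores are open 9am-9pm; check your local store page."),
}

def reply(message):
    words = set(message.lower().replace("?", "").split())
    # Pick the intent whose keywords overlap the message the most.
    _, (keywords, answer) = max(INTENTS.items(), key=lambda kv: len(words & kv[1][0]))
    return answer if words & keywords else "Let me connect you to a human agent."

print(reply("When will my order be shipped?"))
print(reply("Can I get a refund on this jacket?"))
```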

In 2016, Macy's tested a chatbot in its mobile app, powered by IBM Watson, to improve customer experience and get instant feedback. The company aims to answer customer queries efficiently, specific to the individual store location (so far only for 10 such locations).

Interestingly, IBM Watson also powered the pilot of a 'gift concierge' for 1-800-Flowers.com.


3. Ease of Transactions

By integrating with messaging platforms and voice assistants, brands are creating engaging experiences that automate the transaction process.

In 2016, eBay's ShopBot integrated with the popular messaging platform Messenger to increase its reach. Messenger remains the most popular messaging platform, with more than 1 billion users per month. Netatmo, a smart home company, also launched a Facebook Messenger bot that lets users operate their smart home appliances, heaters, lighting, and so on simply through text commands. Pizza Hut launched its own chatbot on social media platforms to receive food orders for delivery; customers can place orders through Facebook and Twitter.

Voice-activated and NLP technologies like Siri, Alexa, Cortana, and Google Assistant may be taking such orders to another level. BMW will start incorporating Alexa in its cars beginning in 2018, enabling drivers to toggle radio stations, obtain directions, and get news through voice commands. Last year, the online retailer eBay teamed up with Google Assistant to let shoppers order through the smart speaker, transforming the entire shopping experience.

4. Good Sales Performance

Chatbots make great salespeople by assisting customers online. Personalization shows customers only the options they are interested in buying.

Kik is already selling products on its messaging platform. The Canadian-based app hosts the official chatbot of H&M, Europe's largest apparel retailer. The bot sounds more like a human than a bot and gives feedback on outfits like a friend would. After an item is selected, the chatbot creates outfits for any occasion; once the customer confirms it looks good, they can order and pay for it online.

Also working in sales, StubHub sells event tickets using Facebook Messenger. It wants to make the entire experience more personal and less mechanical. StubHub lets users give feedback, which is collected and used to make the chatbot smarter.

5. Surveys and Customer Feedback

For a brand to evolve, genuine feedback is valuable. Such responses help shape future business strategies, but only if surveys have high completion rates. Most often, customers abandon surveys halfway through, but chatbots break tedious surveys into conversations, encouraging more users to respond.

Wizu is a perfect example. It is an AI-powered chatbot that understands sentiment, adapting the conversation in real time based purely on customer responses. Currently, it offers an engaging integration with Salesforce.
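The adaptive behaviour can be sketched in a few lines. The example below is a generic conversational survey that branches when an answer sounds negative; it is not Wizu's implementation, and the questions, cue words, and follow-up are hypothetical.

```python
# Illustrative sketch of a conversational survey that adapts to the tone of
# each answer. Not Wizu's implementation; the questions, cue words, and
# follow-up are hypothetical.
NEGATIVE_CUES = {"bad", "slow", "broken", "terrible", "disappointed"}

QUESTIONS = [
    "How was your experience with our service today?",
    "Is there anything you would like us to improve?",
]

def follow_up(answer):
    """Branch the conversation when an answer sounds negative."""
    if NEGATIVE_CUES & set(answer.lower().split()):
        return "Sorry to hear that. What went wrong, specifically?"
    return None

def run_survey(answers):
    transcript = []
    for question, answer in zip(QUESTIONS, answers):
        transcript.append((question, answer))
        probe = follow_up(answer)
        if probe:
            transcript.append((probe, "<free-text response>"))
    return transcript

for q, a in run_survey(["Checkout was slow and the app felt broken", "No"]):
    print(f"Bot: {q}\nUser: {a}")
```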

Hubert is a survey chatbot that replaces the feedback surveys teachers use to improve their teaching. It uses AI and machine learning to chat with students on digital platforms and interpret the data, which helps teachers understand where improvement is needed.

Polly helps conduct surveys on Slack: a user can create surveys and polls, target an audience, and get real-time results. The data collected from the surveys can then be viewed in detail by simply expanding the results.

Interestingly, even with all these benefits, the search for the perfect marketing chatbot continues.

For many, emotion-sensing chatbots may provide the answer. If chatbots start to detect and respond to emotions, the power of the marketing chatbot could be fully unleashed.

Originally published here.




Lessons from 3 Big Data Breaches of 2017

In 2017, cyber-attacks invaded the privacy of millions, grabbing headlines as established firms struggled to keep the noise down.

Interestingly, these breaches had a common theme — they were reported weeks after their initial discovery.

Here are three of the many security breaches of 2017 that impacted millions due to reasons including lax security, weak passwords, or avoidable negligence.

Equifax

The breach of the Atlanta-based credit reporting agency Equifax is one of the worst data breaches recorded in the history of cybercrime. The heist involved highly sensitive consumer data, including the personal details of 143 million customers, their Social Security numbers, credit card accounts, and 11 million US driver's licenses.
Many aggrieved customers are facing repercussions that include "multiple fraudulent charges" on credit cards and "unauthorized mortgage loans" on associated accounts. Additionally, stolen driver's license information is paving the way for impersonators on websites and services that accept a driver's license as identification.
But it was Equifax's negligence and delay in addressing the issue that made matters worse for the firm. The hackers retrieved the data between May and July 2017, but it took the company six weeks to report the breach.
After uncovering the breach, Equifax's attempt to address concerns through a data breach tool was also a failure: its support site resembled just another phishing site and directed consumers to sign up for Equifax's credit monitoring product, TrustedID.
Considered the "mother of all hacks", the breach eventually forced the agency to retract a clause on the site that prevented affected customers from suing the company.

Yahoo — Verizon

Reported in August 2016, Yahoo’s hack originally traces back to 2014.

Initially, the breach impacted at least 500 million user accounts. But over time, Yahoo's breach took complicated turns, with deeper revelations. By the end of 2016, the company disclosed another hack, which impacted one billion users. Four months after the Yahoo-Verizon merger closed in mid-June, Verizon made a startling announcement: in an update in October 2017, it revealed that every Yahoo account had been impacted, close to 3 billion in all.
The disclosure made Yahoo home to the largest data breach in history, raising serious concerns about the stolen data.
In its statement, Verizon said, “The company recently obtained new intelligence and now believes, following an investigation with the assistance of outside forensic experts, that all Yahoo user accounts were affected by the August 2013 theft.”
But on July 13, 2017, Verizon too found itself amid a data exposure that impacted 14 million subscribers.
Consumers who had called customer service in the six months before the July disclosure were exposed. ZDNet reported that the data of Verizon customers was available for download after an employee at an Israel-based firm, Nice Systems, left it on an unsecured Amazon S3 storage server. The security lapse was discovered in June 2017 by Chris Vickery, director of cyber risk research at the security firm UpGuard.
The exposure included customer records containing names, cell phone numbers, and account PINs.

Uber

When talking about breaches, Uber's wait-watch-disclose approach provides a lesson for many hyped startups on why a business model needs to be an ethical one. On November 21, 2017, Uber announced that the personal data of as many as 50 million riders and 7 million Uber drivers was stolen in a breach that dated back to October 2016. The stolen information (as reported by CNBC) could allow hackers to find riders' homes and even their travel history.
To worsen matters, Uber paid the hacker $100,000 to delete the data, and the company has not revealed any information about the hacker or how the money was paid. According to Reuters, Uber's newly appointed CEO Dara Khosrowshahi said that the disclosure to regulators should have been made at the time the breach was discovered.
After Uber disclosed the breach, two of the company's top security officials were dismissed. The breach raised serious concerns about the ethics and privacy practices of the ride-hailing app.

Why are data breaches becoming common?

Poor cybersecurity practices give way to cybercrime. For established firms like Deloitte, which prides itself on its Cyber Intelligence Centre, data breaches can be embarrassing. In March 2017, the accounting giant fell victim to a cyber-attack. The Guardian reported that "a host of clients had material that was made vulnerable by the hack", including the US departments of state, energy, homeland security and defense, the US Postal Service, and Fannie Mae and Freddie Mac, the housing giants that fund and guarantee mortgages in the US. A Data Breach Investigations Report (DBIR) by Verizon showed that 70% of breaches are financially motivated and 80% of hacking-related breaches involve stolen and/or weak passwords.

Outdated Security Technology

Any security breach stems from lapses in the security practices of the firm. At the time of the hack, Deloitte did not have multi-factor authentication, which allowed outside hackers to get into the system through the administrator's account.

Malicious outside attacks are costlier than data breaches caused by system glitches or human error. The extraction of personal data by hackers should be addressed immediately, before the damage spreads across the entire customer base and the hackers cover their tracks. As cyber-attacks become common, it is important for CEOs to address cybersecurity diligently and create a protected work environment. In an interview with CNBC, McAfee CEO Chris Young said CEOs must enforce a "culture of security".

Delay in Reporting and Response

An IBM report shows that "the faster the breach can be identified and contained, the lower the costs". Real-time detection remains distant for many companies, but a graver concern is when data breaches are kept under wraps for a long time.

Reporting a data breach years after it occurred only worsens the situation for both consumers and the firm's goodwill. Late disclosures erode trust in organizations.

Uber and Yahoo are examples of why companies should not delay disclosure. More than anything, immediate efforts should be made to fix the issue, since delay only aggravates it. Some of the biggest breaches, like Yahoo's, took years to be revealed: the two hacks Yahoo reported in 2016 actually dated back to 2013 and 2014, and its 2017 update worsened matters when it was revealed that the breach had impacted 3 billion accounts instead of the 1 billion reported in 2016.

According to the latest findings by digital security provider Gemalto, almost 2 billion data records around the world were lost or stolen through cyber-attacks in the first half of 2017. A cyber-attack is a serious crime, and its impact can be devastating to the security of people in any country. Protecting sensitive information from hacks is crucial, and in light of growing cyber-attacks, people should keep a close eye on fraudulent activity such as suspicious bank account transactions, exposed credit card information, and phishing scams.


Banking with Artificial Intelligence

Faced with unprecedented challenges, banks have started racing to embrace AI to gain a competitive advantage.

With the advent of chatbots, personal assistants, and robo-advisors, it may not be too hard to imagine that the next wave of technology could revolutionize the traditional style of banking.
An Accenture report recently indicated that within the next three years, banks will deploy Artificial Intelligence (A.I.) as their primary method to interact with customers.
In early 2016, the Swedish-speaking Amelia became the first non-English deployment of IPsoft's AI platform, at SEB, one of Sweden's largest banks. The bank adopted the "digital employee" Amelia to integrate into its front office. The cognitive agent solves problems just like humans "but in a fraction of the time", interacts just like humans, and even senses emotions.
But Amelia is just one aspect of what the future of banking may look like.
Gartner estimates that by 2020, customers will manage 85 percent of their relationship with a business without any human interaction. With changing customer needs and a growing generation of millennials, a bigger challenge for banks is fierce competition from tech-savvy firms like Apple, Amazon, Facebook, and Google.
Tech giants are offering financial services that are both popular and preferred by most millennials, a large demographic of the economy. The Millennial Disruption Survey points out that nearly half of the millennials surveyed are counting on tech start-ups to overhaul the way banks work, and as many as 73% of millennials would be more excited about new financial services offered by Google, Amazon, Apple, PayPal, or Square than by their own nationwide bank.
Faced with unprecedented challenges, banks have started racing to embrace AI to gain a competitive advantage. AI adoption in banks is spreading across different operations, some of which are highlighted below:

Fraud Detection

Replacing old statistical approaches, traditional banks are adopting cognitive computing technology for early fraud detection. This helps prevent theft of personal information and funds. To use AI to detect fraudulent phone calls, Lloyds Banking Group recently partnered with the US-based startup Pindrop.

Pindrop's patented Phoneprinting™ technology identifies 147 different features of a human voice from a single call. It creates an audio fingerprint of the caller and looks for unusual activity and potential fraud in order to trace criminal callers.
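As a rough illustration of fingerprint matching (not Pindrop's actual Phoneprinting algorithm), a call's feature vector can be compared with an enrolled fingerprint using a similarity score; the feature values and the decision threshold below are hypothetical.

```python
# Illustrative only: comparing a call's audio feature vector with a stored
# "fingerprint" using cosine similarity. This is not Pindrop's Phoneprinting
# algorithm; the feature values and threshold are hypothetical.
import math

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

# Hypothetical enrolled fingerprint vs. features from a new call
# (a real system would use far more features, not three).
enrolled_fingerprint = [0.62, 0.10, 0.88]
incoming_call = [0.10, 0.95, 0.20]

similarity = cosine_similarity(enrolled_fingerprint, incoming_call)
if similarity < 0.8:  # hypothetical decision threshold
    print(f"Similarity {similarity:.2f}: flag the call for fraud review")
else:
    print(f"Similarity {similarity:.2f}: caller matches the enrolled fingerprint")
```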

Customer Services

Besides fraud detection, banks are also adopting AI to personalize services and provide real-time solutions to client-specific needs.

In 2013, ANZ Bank in Australia was among the first banks to explore the possibilities of AI. It began using IBM's Watson to help its financial advisors understand their clients, and it now plans to extend its cognitive computing into areas like advisory, risk, and back-office automation. The leading investment bank Goldman Sachs invested in Kensho, an AI-based financial research platform, in 2014.

The analytics platform can respond instantly to complex situations, query millions of documents, and be questioned in natural language. In February 2017, Wells Fargo created an AI team to provide more personalized services to its customers and strengthen its digital offerings.

Financial Advisory

Banks have started deploying AI for other services as well, including wealth management and financial advisory.

Bank of America will be launching its virtual assistant, Erica, later this year; integrated into the mobile banking app, it will help clients improve their financial lives.
In 2014, Bloomberg reported that the Swiss bank UBS picked Singapore-based Sqreem to identify individuals' behavioral patterns and match each individual with different types of wealth management products.

Legal Work

In March 2017, JPMorgan put together a new team to automate its legal work. The program, named Contract Intelligence, or COIN, automates hours of reading, including the interpretation of commercial loan agreements which, until the program went online late last year, consumed over 360,000 hours of work each year by lawyers and loan officers. Now the work is done in seconds, with fewer errors, by a system that never sleeps.

As AI floods the banking channels, many have raised concerns about the future of traditional banking jobs that face “the automation risk”.

According to a 2016 report by Citi, 30% of banking jobs (close to 2 million) are under threat across the US and Europe over the next decade. With a decline in branch offices, the number of US bank tellers has already fallen by 15% since 2007. However, many are adamant that AI will not be a threat to the banking industry.

Last year, ANZ CTO Patrick Maes told iTnews in an interview,

"This is just taking the monotone and repetitive tasks we have created through complexity in the IT landscape out, where we basically have humans becoming the integration between two systems; that is what will be replaced."

This story originally appeared here.


Here is How Big Data is changing the Oil Industry

In 2006, marketing commentator Michael Palmer blogged, "Data is just like crude. It's valuable, but if unrefined it cannot really be used."

Nine years on, the statement still holds true for any industry that depends on large volumes of data. Unless data is broken down and analyzed, it holds little value.

As the world becomes more receptive to the advantages of big data, the oil industry does not seem far behind. Data that is merely stored has little worth; to be useful, it has to be identified, aggregated, stored, analyzed, and refined. The ability to access and draw rich insights from large datasets can make the oil industry more profitable and efficient. A successful oil company will quickly forecast from the available information and keep costs low, without overlooking discrepancies in the evaluation of the dataset.

Both oil price movements and the popularity of big data have gradually created a stir. Changes in the supply and demand of oil have long been related to fluctuations in oil prices. With falling prices, the oil and gas industry is slowly finding its way towards big data in order to manage and reduce risk, thereby increasing a company's overall revenue. Global oil prices are becoming competitive, and as oil-producing economies fight to gain market share, big data analytics can help them identify areas that require significant improvement.

Benefits of Adopting Big Data in The Oil Industry

According to Mark P. Mills, a senior fellow at the Manhattan Institute, “Bringing analytics to bear on the complexities of shale geology, geophysics, stimulation, and operations to optimize the production process would potentially double the number of effective stages, thereby doubling output per well and cutting the cost of oil in half.”

Tech-driven oil fields are already expected to tap into 125 billion barrels of oil, and this trend may affect the 20,000 companies associated with the oil business. Hence, to gain a competitive advantage, almost all of them will require data analytics to integrate technology throughout the oil and gas lifecycle.

1. Real-Time and Highly Cost-Effective

Data volume in the oil industry grows rapidly, and handling large amounts of data efficiently becomes very important. Oil companies have always generated extreme volumes of data at a very high rate. Traditionally, handling such large volumes has been very expensive for oil and gas producers, and the cost can significantly impact a company's financial performance.

With big data, companies can not only cut costs but also capture large volumes of data in real time. Such use of analytics can help improve production by 6-8%. However, the role of big data in oil and gas goes beyond efficiency and real-time analysis: near-real-time visualization, storage of large data sets, and near-real-time alerts are considered among the most important advantages of big data analytics.
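As a sketch of what a near-real-time alert might look like, the snippet below averages the last few readings of a sensor stream and raises an alert when the rolling average crosses a limit. It is a generic illustration; the sensor, readings, window size, and threshold are hypothetical.

```python
# A minimal sketch of near-real-time alerting on a stream of readings.
# Generic illustration only; the sensor name, readings, and threshold are
# hypothetical, not from any specific operator's system.
from collections import deque

WINDOW = 5            # number of recent readings to average
THRESHOLD = 250.0     # hypothetical wellhead pressure limit (psi)

def stream_alerts(readings):
    """Yield an alert whenever the rolling average exceeds the threshold."""
    recent = deque(maxlen=WINDOW)
    for t, value in enumerate(readings):
        recent.append(value)
        rolling_avg = sum(recent) / len(recent)
        if rolling_avg > THRESHOLD:
            yield f"t={t}: rolling avg {rolling_avg:.1f} psi exceeds {THRESHOLD} psi"

pressure_feed = [210, 220, 235, 255, 270, 290, 300, 240, 230]
for alert in stream_alerts(pressure_feed):
    print(alert)
```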

2. Reduction of Risk and Better Decision Making

Geologically, layers of rock vary across regions, even though they may be structurally similar, so lessons learned in one area can often be applied to similar areas. Traditionally, unstructured data is scattered across different databases and storage facilities, which makes retrieving it time-consuming. Data science can help reduce risk and reveal more about each subsystem, thereby increasing the accuracy of decision-making.

3. High Accuracy in Drilling Methods and Oil Exploration

Since oil production depends on drilling and oil field exploration, any use of big data analytics in this area is considered a boon. Mills writes, "Big-data analytics can already optimize the subsurface mapping of the best drilling locations; indicate how and where to steer the drill bit; determine, section by section, the best way to stimulate the shale; and ensure precise truck and rail operations."

The search for new hydrocarbon deposits demands a huge amount of materials, manpower, and logistics. With drilling a deepwater oil well often costing over $100 million, no one wants to be looking in the wrong place. To avoid this, Shell uses fiber-optic cable sensors (created in a special partnership with Hewlett-Packard), and the data is transferred to its private servers maintained by Amazon Web Services (AWS). This gives engineers a far more accurate idea of what lies beneath and saves a considerable amount of time and effort.

New drilling locations and new ways to stimulate shale oil are only some of the benefits of applying big-data analytics in the oil industry. Seismic software, data visualization, and pervasive computing devices are some of the modern analytical tools currently being adopted by oil firms.

4. Efficient Performance of Machines

Oil drilling is a continuous process, and machines have to work long hours under severe temperatures and conditions. Big data is used to ensure that machines are working properly and to catch breakdowns or failures early. Machines are fitted with sensors that collect data about their performance; this data is then compared with aggregated data so that parts are replaced efficiently and downtime, along with the expense that comes with it, is minimized.
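A minimal sketch of that comparison: each machine's recent readings are checked against an aggregated fleet baseline, and metrics that drift too far are flagged for inspection. The metric names, values, and tolerance below are hypothetical.

```python
# A minimal sketch of comparing one machine's sensor readings against an
# aggregated fleet baseline to flag parts for maintenance. The values, metric
# names, and 20% tolerance are hypothetical.
FLEET_BASELINE = {          # aggregated average readings across the fleet
    "pump_vibration_mm_s": 4.0,
    "bearing_temp_c": 70.0,
    "motor_current_a": 35.0,
}

def parts_to_inspect(machine_readings, tolerance=0.20):
    """Flag any metric that deviates from the fleet baseline by more than tolerance."""
    flagged = []
    for metric, baseline in FLEET_BASELINE.items():
        value = machine_readings.get(metric)
        if value is not None and abs(value - baseline) / baseline > tolerance:
            flagged.append((metric, value, baseline))
    return flagged

rig_42 = {"pump_vibration_mm_s": 5.6, "bearing_temp_c": 72.0, "motor_current_a": 46.0}
for metric, value, baseline in parts_to_inspect(rig_42):
    print(f"{metric}: {value} vs fleet baseline {baseline} -> schedule inspection")
```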

In a recent survey by Accenture and Microsoft of oil companies and their support industries, 86% to 90% of respondents said that increasing their analytics capabilities, using mobile technologies in the field, and relying more on the Industrial Internet of Things would increase the value of their business. According to the survey, over the next 3-5 years, investment in big data is expected to increase from 56% to 61% of respondents, and investment in automation from 53% to 65%. Finding and producing more hydrocarbons at lower cost, in economically sound and environmentally friendly ways, not only adds value to the data but also supports accurate decision-making.

Considerations while applying big data tools in the oil industry

The popularity of big data across industries has gained momentum as data awareness has increased. Good analytics help managers become more proficient in managing different kinds of data, which may include seismic data, drilling logs, operational parameters such as drill-bit RPM and weight on bit, frack performance data, and production rates. With each function producing vast and variable data, the right data needs to reach the right hands in order to optimize performance.

Like other industries, the oil and gas industry needs to understand how big data can best be used and what applications are possible. Since not all data is valuable, knowing what to store becomes important. As the global oil supply-and-demand equation shifts, more and more statistics need to be mapped. Locations and techniques also need frequent restructuring, and this requires professionals who not only know how to use big data efficiently but can also draw value out of it. The challenge is also one of efficiency in the data process: sifting the important from the unimportant.

These experts need to know when a technology upgrade is required, since the oil and gas industry is driven by ever-fluctuating supply and demand. They should understand open-source models, cloud technologies, pervasive computing, and iterative development methodologies. Shell has about 70 people working full-time in its data analysis department, along with hundreds more around the world participating on an ad hoc basis.

The gradual transition to big data may not be easy for many oil companies, since many lack the manpower and the means to hire personnel who can handle big data. Only about 4% of companies across industries have the talent and skills they need to draw tangible business value from analytics.

Personnel and cyber security also need attention, since they remain perceived barriers to realizing the value of big data analytics. Real-time big data analytics presents innovative opportunities for more efficient oil production, cost and risk reduction, safety improvements, better regulatory compliance, and better decision-making. Expertise and strategic prudence in using big data tools will not only ensure success but also reduce the margin of error.


Bridging the Skills Gap of New York City ~ A Case Study using Tableau and MySQL

Data Analytics remains incomplete without data visualization.

 

In a Data Analytics course, I learned how useful a tool Tableau is for creating and explaining a visual story that relies heavily on big data. As a student, I was given cases that required data retrieval, cleaning, manipulation, and analysis in order to make appropriate recommendations.

 

To start with, I have a double major in finance and international banking and had some knowledge of data analysis. Tableau's features were not only quick to understand but also simple to use. The journey, which quickly graduated from Excel to R to SQL and finally to Tableau, was a great experience, and after the course I was excited to share a project that relied heavily on Tableau.

 

I divided the project into four sections:

  1. The Question
  2. The Answer
    • The Process
    • The Findings
  3. The Limitations
  4. The Conclusion

 

I first looked at the problem, understood what the data was lacking, and created an entity-relationship diagram in MySQL Workbench. Since the focus was the NYC MSA region, later trimmed to NYC only, I had to ensure that the data I sifted reflected only the problem in question. My findings are based on the data and the way it was sorted and cleaned. From the retrieved data, I noticed a mismatch between the supply and demand of skills across states. My Tableau worksheet also looked at the average posting duration across 11 states, which I capped at 35 days.

 

Summarised below is the graph, made in Tableau, which compares the average posting duration (in days) with the percentage of job postings across all states for which data was provided.

[Figure: Job postings across all states]

 

THE QUESTION

 

A recent Brookings study of Burning Glass data found that, nationally, the median advertising duration for a STEM vacancy is more than twice as long as for a non-STEM vacancy. The case revolves around the ongoing debate over education and data from the leading labor market analytics firm Burning Glass. The case aimed to:

  • Evaluate the NYC skills gap
  • Provide data-driven recommendations to the NYC Skills Coalition (NYCSC)
  • Help allocate $100 million over ten years for workforce development in the city of New York.

 

According to The National Federation of Independent Business:

As of first-quarter 2017, 45% of small businesses reported that they were unable to find qualified applicants to fill job openings.

THE ANSWER

 

In order to evaluate the city's skills gap, a detailed roadmap would require collaborative efforts by federal, state, and private foundations. To answer the question, I first process the data and then look at suggested measures based on the findings.

 

THE PROCESS

 

Before diving into the data set, I first try to understand what a skills gap is and why it exists. Simply put, it is the mismatch between the demand for and supply of skills: the jobs available versus the skill set required in an ideal candidate. Gaps can exist for various reasons, including an insufficient number of jobs and a lack of the proper skill set among candidates.

 


 

The data is in CSV format and covers different occupations, skills, and counties. Duplicates are removed and N/A values are converted to NULL values.
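For illustration, the same cleaning step could be done in Python/pandas (the case itself used MySQL and R); the file and column names here are hypothetical.

```python
# A sketch of the cleaning step in Python/pandas (the case itself used
# MySQL/R); the file and column names are hypothetical.
import pandas as pd
import numpy as np

postings = pd.read_csv("job_postings.csv")       # occupations, skills, counties

postings = postings.drop_duplicates()            # remove duplicate rows
postings = postings.replace("N/A", np.nan)       # treat N/A strings as NULL

# When loading into MySQL, NaN values become SQL NULLs.
print(postings.isna().sum())                     # count of NULLs per column
```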

 

Using MySQL Workbench, an entity-relationship diagram (ERD) is created in order to sort and join the data using primary and foreign keys.
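The joins that the ERD's keys enable look roughly like the sketch below, shown with an in-memory SQLite database for portability rather than MySQL; the table and column names are hypothetical.

```python
# A sketch of the key-based join implied by the ERD, shown with SQLite in
# memory for portability (the case used MySQL Workbench). Table and column
# names are hypothetical.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE occupations (occupation_id INTEGER PRIMARY KEY, title TEXT);
CREATE TABLE postings (
    posting_id INTEGER PRIMARY KEY,
    occupation_id INTEGER REFERENCES occupations(occupation_id),
    county TEXT,
    duration_days INTEGER
);
INSERT INTO occupations VALUES (1, 'Registered Nurse'), (2, 'Financial Analyst');
INSERT INTO postings VALUES (10, 1, 'Kings', 42), (11, 2, 'New York', 28);
""")

# Join postings to occupations on the foreign key, as the ERD defines.
for row in con.execute("""
    SELECT o.title, p.county, p.duration_days
    FROM postings p
    JOIN occupations o ON o.occupation_id = p.occupation_id
"""):
    print(row)
```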

 

[Figure: Entity-relationship diagram (ERD) of the dataset]

 

THE FINDINGS

 

After creating the ERD, I limit the information to New York City only. The five counties in NYC are Bronx, Kings, New York, Queens, and Richmond. I use R to perform this task and, with the available data, create a heat map in Tableau. The data shows that the number of job postings is largest in NYC. I then focus on the percentage of job postings across the tri-state region of New York, New Jersey, and Pennsylvania to analyze where New York stands.
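The county filter itself is a one-liner; the case used R, but an equivalent step in Python/pandas might look like this (column names hypothetical):

```python
# The case used R for this step; an equivalent filter in Python/pandas might
# look like this. The file and column names are hypothetical.
import pandas as pd

NYC_COUNTIES = {"Bronx", "Kings", "New York", "Queens", "Richmond"}

postings = pd.read_csv("job_postings.csv")
nyc_postings = postings[postings["county"].isin(NYC_COUNTIES)]

# Postings per county, e.g. to feed a Tableau heat map.
print(nyc_postings.groupby("county").size())
```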

 

These findings, visualized in Tableau, are shown below.

[Figure: Percentage of job postings across New York, New Jersey, and Pennsylvania]

I then take a deep dive into how long a job posting takes to get filled. For this, I compare the number of job postings with the average posting duration by occupation (for New York City only; the MSA region code is 234). My findings show that New York City's gap is primarily driven by a scarcity of workers with certain skills.
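A sketch of that comparison in Python/pandas (the original analysis used SQL, R, and Tableau; the column names are hypothetical):

```python
# A sketch of comparing posting counts with average posting duration by
# occupation in Python/pandas; column names are hypothetical.
import pandas as pd

postings = pd.read_csv("job_postings.csv")
nyc = postings[postings["msa_code"] == 234]       # New York City MSA

by_occupation = (nyc.groupby("occupation")
                    .agg(num_postings=("posting_id", "count"),
                         avg_duration_days=("duration_days", "mean"))
                    .sort_values("avg_duration_days", ascending=False))

# Occupations with many postings AND long durations suggest a skills scarcity.
print(by_occupation.head(10))
```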

[Figure: Number of job postings vs. average posting duration by occupation, New York City]

The top jobs in NYC come from three industries: Healthcare, Information Technology, and Finance & Accounting. Using Tableau, I create a bubble chart to look at the top jobs.

[Figure: Bubble chart of top jobs in NYC]

The top jobs in NYC are then analyzed in depth by compiling the experience level, education, and certifications required. Correspondingly, I look at the number of jobs and the average number of days they have remained posted and unfilled. The findings are given below.

 

1. HEALTHCARE INDUSTRY

 

The healthcare industry, it seems, has the highest demand for nursing managers and registered nurses, and the certifications required for these roles are listed below. There is a high chance that the training gap exists because candidates for these roles lack the necessary certifications, making them ineligible for the postings.

 


[Figure: Healthcare industry findings]

2. FINANCE INDUSTRY

 

Looking at the available data and simplifying the visuals through Tableau, my findings suggest that there is huge demand for entry-level financial analysts with a background in accounting and for tax managers with 3-5 years of work experience. Looking closely, the Certified Public Accountant (CPA) and Chartered Financial Analyst (CFA) credentials are popular choices for financial analysts looking to bridge the skills gap.

 


[Figure: Finance industry findings]

3. INFORMATION TECHNOLOGY INDUSTRY

 

The third industry that could benefit from the funding is the information technology industry.

Through data sorting and with the help of Tableau, I was able to visualize the top job titles that could benefit: Business Intelligence Analyst, IT Project Manager, Software Developer/Engineer, and Systems Analyst. Demand for these titles can mean one of two things: either there is huge demand that is being met just as quickly, or there is a lack of skills that has made hiring slow.

In the case of the IT industry, the latter seems to be true. Looking closely, it can be concluded that certifications like the Series 7 and the Project Management Professional (PMP) would add value to the resumes of candidates for the job titles that are in high demand. While this may be truer for people at the mid or senior level, gaining such certifications at an early stage could put a deserving candidate at the top of the hiring list.

 


[Figure: Information technology industry findings]

But the million-dollar question remains: what comes after a skills gap analysis, and what measures can local authorities take to bridge the gap and raise the employment rate in the city of New York? After a thorough literature review, I designed a set of suggested measures that could enable corporations, government, and non-profit organizations to get more involved.

[Figure: Suggested measures for bridging the skills gap]

One of these measures, paying competitive salaries, may or may not be supported by the data used; it is based on the extensive secondary literature available.

 

THE LIMITATIONS

Like most cases, the data provided suffers from limitations that may or may not support the above measures. One of the biggest limitations is the year of the data: since no year is provided, the findings may not hold true for 2016 or 2017. Also, while cleaning and sorting, there was no way to capture how many incumbent, unemployed, or out-of-the-labor-force workers had the requisite skills to fill the in-demand jobs. Prudent decisions can only be made when the cleaned data contains the components needed for a strong analysis. Salary was not taken into account since it was sparse, with much of it missing along with the missing dates. The recommendations could also vary, since there is no real-time or additional information that may have affected the data set.

 

Sometimes changes in administration allow new labor laws and regulations to kick in, so the recommendations drawn from the data could also vary, since they do not account for real-time or additional information. I also looked at the city of New York as a whole, but there is room for future analysis in which looking at individual counties may show variation in the skills gap.

 

CONCLUSION

[Figure: Conclusion]

Originally published on Tableau Community.
