
Thursday, December 1, 2016

Amazon goes all-in on AI and big data at AWS re:Invent 2016

Image: AWS re:Invent 2016


On Wednesday, at the first keynote of the AWS re:Invent conference in Las Vegas, Amazon Web Services (AWS) CEO Andy Jassy took the stage to explain a host of new updates to the cloud provider's portfolio of services. And, it seems Amazon is making a big bet on next-generation technology.

Some of the biggest announcements were the first three services of the Amazon AI portfolio. For starters, Amazon Rekognition provides image recognition, categorization, and facial analysis in batch or in real time. The facial analysis can detect sentiment and tell whether or not the subject is wearing glasses, for example.
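For readers curious what that looks like in practice, here is a minimal sketch of a Rekognition call through boto3, the AWS SDK for Python; the bucket and image names are hypothetical placeholders.

import boto3

# Analyze a face image stored in S3 (bucket/object names are invented)
rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "my-photos", "Name": "visitor.jpg"}},
    Attributes=["ALL"],  # request emotions, eyeglasses, age range, etc.
)

for face in response["FaceDetails"]:
    top_emotion = max(face["Emotions"], key=lambda e: e["Confidence"])
    print(top_emotion["Type"], "glasses:", face["Eyeglasses"]["Value"])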

Amazon Polly is a text-to-speech (TTS) service that is powered by deep learning. It takes a text input and returns an MP3 stream that is altered to sound more like actual conversation. For example, if the text contains "WA," the output might say "Washington" instead. Jassy also announced that a new service called Amazon Lex, which powers Alexa, is on the way as well. Lex provides natural language understanding and automatic speech recognition.
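A similarly minimal, hypothetical sketch of a Polly request with boto3 (the voice and text below are placeholders, not from the keynote):

import boto3

polly = boto3.client("polly", region_name="us-east-1")

# Synthesize a short sentence to an MP3 stream; "WA" can be spoken as "Washington"
result = polly.synthesize_speech(
    Text="The package ships from Seattle, WA tomorrow.",
    OutputFormat="mp3",
    VoiceId="Joanna",  # one of Polly's built-in voices
)

with open("speech.mp3", "wb") as f:
    f.write(result["AudioStream"].read())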


The new AI tools announced by Jassy could make it much easier for enterprise customers to tap into, and leverage, technologies such as machine learning to build their next generation of applications.


In his address, Jassy also noted that Amazon would be launching a new analytics product called Amazon Athena. As a companion to the existing EMR and Redshift, Athena is an interactive query service that allows users to analyze data in S3 using SQL. This significantly lowers the barrier for everyday IT staff to use big data analytics to glean insights.
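As a rough sketch of that workflow, assuming a table has already been defined over data in S3 (the database, table, and bucket names below are invented), a query might be submitted through boto3 like this:

import time
import boto3

athena = boto3.client("athena", region_name="us-east-1")

# Run a plain SQL query over data catalogued in an Athena database
query = athena.start_query_execution(
    QueryString="SELECT page, COUNT(*) AS hits FROM web_logs GROUP BY page ORDER BY hits DESC LIMIT 10",
    QueryExecutionContext={"Database": "analytics_demo"},
    ResultConfiguration={"OutputLocation": "s3://my-athena-results/"},
)

# Poll until the query finishes, then fetch the result rows
qid = query["QueryExecutionId"]
while athena.get_query_execution(QueryExecutionId=qid)["QueryExecution"]["Status"]["State"] in ("QUEUED", "RUNNING"):
    time.sleep(1)

results = athena.get_query_results(QueryExecutionId=qid)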

To go along with these new services, Jassy also announced a slew of new compute instances and features. Here are the compute types that were announced:
T2.xlarge - 16 GiB, 2 vCPU
T2.2xlarge - 32 GiB, 2 vCPU
R4 - 48 GiB, DDR4, L3 cache, 64 vCPUs
I3 - 3.3 million IOPS, 488 GiB, 15.2 TB NVMe SSD, 64 vCPUs
C5 - 72 vCPUs, Intel Skylake, 12 Gbps to EBS, 144 GiB


The C5 instance shows just how much heavy-duty processing is being done on AWS. However, to give a broader group of users access to GPU power, Jassy announced a new service called Elastic GPUs for EC2, which allows users to attach a GPU to any of the existing compute instances in AWS.


Continuing in the vein of simplifying some of the features available in AWS, Amazon Lightsail was revealed as a way to make virtual private servers (VPS) easier to launch. Users choose from five bundles, name their server, and create it. Additionally, packages start at only $5 a month.

AWS users will also get access to a preview of F1 instances, Amazon's new FPGA instance family. This will allow users to run custom logic on EC2.

Additionally, to touch on IoT deployments, Jassy also announced AWS Greengrass. This service embeds AWS Lambda compute and other AWS services in connected devices, and allows users to manage them from the AWS console.

Last year, AWS launched Snowball, a secure appliance that makes it easier to move data to the cloud. At the 2016 re:Invent, Jassy unveiled the general availability of Snowball Edge, which has on-board compute and more storage than the previous version.
The 3 big takeaways for TechRepublic readers


Amazon announced three new AI products for image recognition, text-to-speech, and the natural language understanding that powers Amazon Alexa.
Amazon also announced Amazon Athena, an analytics product that allows users to query S3 with simple SQL.
The majority of the compute instance lineup in AWS got an update, with new products as well.

Wednesday, November 30, 2016

Nissan to use big data to alert drivers about maintenance in its upcoming connected cars


Image: MirrorLink
Nissan Motor Co will make its first major foray into internet-connected cars by offering an option in some new vehicles that will use big data technology to notify drivers when vehicle maintenance is required. As automakers compete fiercely to develop self-driving cars and improve the customer experience inside vehicles, Japan’s second-largest car maker said on Tuesday it will begin rolling out the service in Japan and India in 2017, followed by other countries through 2020.
With the availability of new mobility options including ride-hailing and car-sharing services threatening to cool demand for individual car ownership, automakers are looking for new ways to attract loyal drivers. Toyota Motor Corp, Japan’s biggest car maker, announced earlier this month that it will have a similar alerting feature in the domestic version of the upcoming Prius plug-in model.
And Ford Motor Co last month announced that by year’s end, some of its models will be able to communicate with smart home devices using Amazon’s Alexa voice service. Nissan said that it would also market the device required to access the service, which can be retrofitted into existing models. Eventually, 30 percent of its existing vehicles would be equipped with the hardware, it said.
The new service will be enabled by a telematics control unit which will enable the automaker and its dealer network to access information about the car’s diagnostics and location, alerting the driver to any required maintenance work. “With connectivity we can provide better information and better service offerings to our customers,” Kent O’Hara, Nissan corporate vice president and head of its global aftersales division, told reporters at a briefing.
“We’ll know what’s wrong with that vehicle, we’ll know where the vehicle is, we’ll know what parts are needed for the vehicle … and we can provide convenient service and alternative transportation options.” He added that connectivity services and other new technologies would contribute 25 percent of the automaker’s aftersales revenues by 2022, from “low, single digits” at the moment.
Aftersales generally account for around 14 percent of automaker revenues, according to industry experts. O’Hara said that connectivity services would enable Nissan to “enjoy some growth in our retention of customers over what we experience today”. Nissan declined to offer pricing details on the device, but the company is focusing on marketing new technology in mass-market models. Many automakers reserve sophisticated services and functions for higher-end models.
Earlier this year, Nissan launched a minivan in Japan which can self-drive on single-lane motorways and navigate congestion, while this month it launched its new gasoline-electric hybrid powertrain in its Note subcompact car for the Japanese market.
Reuters

Monday, November 7, 2016

Big Data could be the deciding factor between Hillary and Trump


Image: Reuters
By Muqbil Ahmar
It is going to be a photo finish, predict the psephologists. In this head-to-head, technology, particularly Big Data, could play a major role in deciding who eventually gets to wear the crown. While Trump has berated the technology, calling it “overrated,” the Hillary camp has gone all out to mine and collect data from every possible source: from voter registration and public records to social media activity. They are following up on the data-driven election strategy that Barack Obama used to great success while running for his second term. Many election analysts have credited Obama’s Big Data initiatives for his 2012 election win, saying that his tactics set a new precedent for how future election campaigns would be run.
On one side, Hillary has stood her ground, with her campaign team developing and harnessing cutting-edge analytics to target swing voters. Trump hasn’t. While other candidates entered into arrangements and partnerships with data strategists, the sole exception has been Trump, who has instead relied heavily on his own personal appeal. He has, in fact, been critical of Big Data Analytics during his campaign. “I’ve always felt it [Big Data] was overrated. Obama got the votes much more so than his data processing machine. And I think the same is true with me.” Instead, Trump has used social media and traditional media coverage to promote himself and his opinions. Trump’s strategy of not leveraging Big Data may turn against him and put him at a disadvantage compared to his more analytically aware rival.
But now, in the final stretch, Trump has had to eat his words. In the month of September alone, Trump paid the UK Big Data Analytics firm Cambridge Analytica $5 million to help target voters. The company claims to have data on around 230 million adults in the USA and approximately 4,000 “data points” on each of them, including gym and club memberships, charity donations, and card transactions. All this is a last-ditch effort to profile individual voters, gauge their political leanings, and work out how to change their minds.
In the 2012 presidential election campaign, Obama used path-breaking data analytics. First of all, he focused on swing or undecided voters. His strategy was simple: there is no point in wasting resources wooing those who have already made up their minds or are political loyalists—they would not vote favorably even in a zillion years. A core team of more than 100 data analysts ran 66,000 computer simulations every day. Analysts put together data from various sources: voter registration records, charities and donations, and public department information. In fact, third-party data was also bought (including data collected from social media).
Ironically, even now that Trump has belatedly adopted a data-centric strategy, he has followed the well-trodden road of 2012. He hasn’t taken into account that technology, particularly Big Data Analytics, has been innovating fast and moving into uncharted territory to mine and process data, making the 2012 strategies look old school. Over the past four years, private vendors and companies have developed and sharpened Big Data tools, adding Predictive Analytics, Artificial Intelligence, Machine Learning, Sentiment Analysis, and Language Processing to the potent mix to leverage state-of-the-art technologies and expand their scope.
“Big Data techniques and data mining have hugely enhanced and expanded in scope. Machine learning is particularly helpful in increasing the correlation between the data and its influence as there is self-learning inbuilt, so that the tool adapts and learns in every new situation. This can be applied to any situation and business,” said Shashank Dixit, CEO, Deskera, a Cloud technology firm that has recently developed its own Big Data tool.
Campaigners within both parties agree that the Democrats have a sizable lead in collecting information on voters, and smart use of that lead could be the difference, particularly if the contest goes to the wire. Clearly, Hillary Clinton has the advantage of inheriting the database that Obama built over two campaigns; it remains to be seen how successfully she is able to leverage it. But one thing has been established beyond doubt: the candidate with the smartest data wins.
Will Trump’s last-minute effort prove sufficient, or will it be a classic case of too little too late? Only time will tell; either way, the results will be out soon.
With over 10 years of experience in the field of journalism, the author is a technology evangelist and avid blogger.

Tuesday, November 1, 2016

Why big data leaders must worry about IoT security



By Mary Shacklett | October 28, 2016, 1:08 PM PST

The security risks associated with IoT devices cannot be ignored. If your big data plans include IoT devices, follow these four steps to reduce your chances of a security breach.

A series of distributed denial-of-service (DDoS) attacks powered by the malware botnet Mirai on October 21, 2016 disabled Dyn, the domain name system provider for hundreds of major websites, including Netflix, Twitter, and PayPal. The malware infected and spread through systems with the help of hacker-compromised web-connected cameras and digital recorders in consumer households, and security experts expressed their concerns about new threats from home electronics and the Internet of Things (IoT).

Big data leaders should take particular notice of this recent attack, because it highlights why security needs to be top of mind when incorporating IoT into analytics projects.


Research firm Gartner projects that 26 billion IoT devices will be installed by 2020. These IoT devices and sensors will be connected to freight containers, facility alarms, data centers, HVAC environmental monitoring equipment, hospital operating rooms, etc., and companies will be expected to do something with the information collected from these devices.

IoT applications that are already in the field include smart meters used by electric and gas utilities. Estimates are that by 2020, there will be over 900 million of these smart meters installed globally, with Asia leading the transition to smart energy grids, followed by Europe and North America. The cost of installing these smart meters is over $100 billion, but the projected financial benefits will reach $160 billion. So the return on investment (ROI) is there, but what else do companies have to worry about?

With smart meters, we're looking at millions of devices with physical exposure and the ability to inject software attacks from multiple points of entry. To a greater or lesser degree, this IoT exposure also applies to manufacturing, logistics, and other companies operating IoT devices at the edges of enterprises, and even to highly centralized companies where malware could leak in through an IoT-monitored HVAC or environmental monitoring device.

More about IoT security attacks and vulnerabilities

In December 2015, 30 of 135 power substations in the Ukraine were taken out for

Monday, October 24, 2016

Big Data could help optimistic tweeters get lower insurance rates


Image Credits: REUTERS
When people take to Twitter to comment on the great evening they enjoyed with good food and wonderful friends, reducing their monthly insurance bill is probably the last thing on their mind. But such tweets could help insurers to price premiums for individuals, with research suggesting a direct link between positive posts and a reduced risk of heart disease. This could lead to future insurance cover based on “sentiment analysis”, in which Big Data and artificial intelligence make predictive models ever more accurate. Swiss Re says technological advances will cut the price of insurance protection and help individuals and firms make better decisions through programmes that offer advice and incentivise improvements in areas such as health and driving.
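As a purely illustrative sketch, not any insurer's actual model, scoring the tone of a tweet with an off-the-shelf library such as TextBlob might look like this:

from textblob import TextBlob

tweets = [
    "Great evening with good food and wonderful friends!",
    "Stuck in traffic again, everything is terrible.",
]

for text in tweets:
    # polarity ranges from -1.0 (negative) to 1.0 (positive)
    polarity = TextBlob(text).sentiment.polarity
    print(f"{polarity:+.2f}  {text}")

A real predictive model would, of course, combine many such signals with claims history and other data before it influenced pricing.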
However, detractors fret that such developments could erode customers’ privacy or lead to increasingly personalised pricing, undermining the basic principle of

Sunday, October 16, 2016

The secrets to big data project success for small businesses



Image: iStock/stefanamer

By Mary Shacklett | October 14, 2016, 9:13 PM PST

Big data analytics can help small businesses level the playing field. Find out what the cloud has to do with it, and read tips on selecting the optimal analytics tools to use.


Companies that sell third-generation reporting products and many cloud solutions providers have expanded into the domain of big data and analytics, making these technologies more affordable and accessible to small companies. Unfortunately, many small companies don't know how to make the best use of these resources, or know how to change their operations so analytics can help their bottom line. Read about two small companies that have succeeded with big data.

Two big data success stories

Outdoor venues like the Point Defiance Zoo & Aquarium in Tacoma, Washington, rely on attendance to keep the doors open, and attendance is highly dependent on weather. The zoo's management worked with IBM and BrightStar Partners, an analytics firm, to come up with a better way of predicting outdoor zoo attendance for the purposes of budgeting and staffing. Historical attendance records for the zoo were parsed and then analyzed against years of detailed local climate data collected by the National Weather Service. This ultimately led to new insights that helped the zoo anticipate with surprising precision how many customers would show up on a given weekend. The analytics helped with staffing, predicting attendance, and launching promotions.

In Tucson, Arizona, Brian Janezic was used to manually checking cleaning supplies and vending machine items to determine what to order for his two self-service car wash locations. He installed sensors and collected IoT (Internet of Things) data from his drums of chemicals, which enabled him to automate the monitoring of chemical consumption and the triggering of reorder points. This saved him time and helped him manage operating costs more efficiently.
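A minimal sketch of that kind of reorder logic, with sensor readings and thresholds invented purely for illustration:

# Hypothetical reorder check for a chemical drum monitored by a level sensor
REORDER_POINT_LITRES = 40.0

def check_drum(drum_id: str, level_litres: float) -> None:
    """Flag a drum for reorder when the sensor reading drops below the threshold."""
    if level_litres < REORDER_POINT_LITRES:
        print(f"Drum {drum_id}: {level_litres:.1f} L remaining -> reorder triggered")
    else:
        print(f"Drum {drum_id}: {level_litres:.1f} L remaining -> OK")

# Example readings as they might arrive from the IoT sensors
for drum, level in [("soap-1", 35.2), ("wax-2", 120.0)]:
    check_drum(drum, level)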

Guidelines for small businesses with big data projects

Since most analytics end up coming from data sources that small businesses are already familiar with, the keys to success depend on identifying a tightly defined business case that is calculated to bring specific results and on not making the initial project scope too big. Also, small businesses should look to augment corporate knowledge obtained from their internal systems and offline documentation with big data insights.



In addition, small companies can avoid expensive capital investments in hardware and software, since they can check out the offerings of cloud-based big data crunchers and analytics providers.

In the cloud market, there are pay-for-use and pay-by-subscription providers that help businesses get their non-digitalized data into digital form so it can be used in analytics. Other providers collect the data and combine it with publicly available information to help the small business owner better understand the market and their customers in order to make smarter business decisions. These providers can supply the small business owner with the reporting and query tools and dashboards so they can ask their own questions of the data. Other providers, like Google Analytics, offer free web traffic monitoring tools, metrics, and traffic sources, and share data about website visitors.


Ultimately, your business case and project will dictate the types of analytics tools you will use. The tools should meet these four vetting criteria:
the cost should be reasonable (preferably, you pay only for what you use);
the tools should be easy and intuitive to use, with very short learning curves;
the data that you provide to the process, along with any data provided by your vendor, should be data that you trust; and
the solutions must enable you to meet your goals.

Using these guidelines, small businesses can have successful big data projects, and better yet, help level the playing field in today's highly competitive marketplace.

Here's the big picture on election big data from the mobile photo social network Instagram.



Images: Hillary Clinton, Donald Trump | Instagram

By Dan Patterson | October 13, 2016, 10:50 AM PST



The 2016 election has sparked the most mobile, connected campaigns in history. Both campaigns use social media vociferously, and how each campaign uses the social web is a reflection of the campaign's personality and strategic style. Trump fires multimedia missives about an assortment of topics, while the Clinton campaign creates vertical social media tailored to specific networks with careful precision.

This year has also arguably been the most photographed election in history. Technologically, the 2008 and 2012 campaigns were defined by the emergence of social media, primarily Facebook and Twitter. In 2016, for both presidential campaigns, Instagram is an indispensable tool.

And for good reason: the social network, which Facebook acquired for $1 billion in 2012, is growing rapidly. As of June 2016, Instagram had over 500 million monthly active users, 100 million more than in September 2015. This puts the company ahead of prominent social networks like Twitter, with 320 million monthly users, and LinkedIn, with 100 million monthly users. Facebook has approximately 1.57 billion monthly users.



Instagram recently improved its brand management and advertising tools. In September the company announced it had half a million advertisers, double the figure from six months earlier. An additional 1.5 million profiles had been converted to brand pages, giving companies access to

Big data applications are 10X more complex than regular apps, and developers often need to know a plethora of technologies just to make big data work.



Image: iStockphoto/SIphotography

By Matt Asay | October 13, 2016, 8:12 AM PST


Big data is still too difficult. Despite all the hype—and there has been lots and lots of hype—most enterprises still struggle to get value from their data. This led Dresner Advisory Services to conclude, "Despite an extended period of awareness building and hype, actual deployment of big data analytics is not broadly applicable to most organizations at the present time."

Ouch.

Some of this is a people problem. However persuasive the data, executives often prefer to ignore it. But a big part of the complexity in big data is the software required to grok it all. Though Spark and other, newer systems have improved the trajectory, big data infrastructure remains far too hard, a point made astutely by Jesse Anderson.
This stuff is hard

People have long loomed as one of the biggest impediments to big data adoption. A 2015 Bain & Co. survey of senior IT executives found that 59% believed their companies lack the capabilities to make sense (and business) of their data. Speaking specifically of Hadoop, Gartner analyst Nick Heudecker suggested that "Thru 2018, 70% of Hadoop deployments will not meet cost savings & revenue generation objectives due to skills & integration challenges." Skills matter, in other words, and are in short supply.


Over time the skills gap will decrease, of course, but understanding the average Hadoop deployment, for example, is non-trivial, as Anderson noted. In his words, the complexity of big data comes down to two primary factors: "you need to know 10 to 30 different technologies, just to create a big data solution," and "distributed systems are just plain hard."

The question is why.

Anderson schematically represented the complexity of a typical mobile application versus a Hadoop-backed application, noting that the latter involves double the number of "boxes," or components. Expressed in plain English, however, "The 'Hello World' of a Hadoop solution is more complicated than other domains' intermediate to advanced setups."
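To make that concrete, here is roughly what the word-count "Hello World" looks like in PySpark; the few lines of code are the easy part, while the cluster, resource manager, and distributed storage they assume are where the complexity lives (the input path is a placeholder):

from pyspark.sql import SparkSession

# Even the canonical word count presumes a working Spark/Hadoop stack underneath
spark = SparkSession.builder.appName("wordcount").getOrCreate()

lines = spark.read.text("hdfs:///data/logs/")  # placeholder input path
counts = (
    lines.rdd.flatMap(lambda row: row.value.split())
    .map(lambda word: (word, 1))
    .reduceByKey(lambda a, b: a + b)
)

for word, count in counts.takeOrdered(10, key=lambda pair: -pair[1]):
    print(word, count)

spark.stop()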


Compounding the difficulty, Anderson said, is the need to understand the wide array of systems involved. You might need to know 10 technologies to build a big data application, for example, but that likely requires you to have some familiarity with another 20 technologies simply to know which one to use in a given situation. Otherwise, for example, how are you going to know to use MongoDB instead of HBase? Or Cassandra? Or Neo4j?


Add to this the complexity of running it all in a distributed system, and it's no wonder that the skills shortage for big data persists.
The easy way out

One way that enterprises are trying to minimize the complexity inherent in big data build-outs is by turning to the public cloud. According to a recent Databricks survey of Apache Spark users, deployment of Spark to the public cloud has ballooned 10% over the last year to 61% of total deployments overall. Instead of cumbersome, inflexible on-premises infrastructure, the cloud allows for flexibility and, hence, agility.


It does not, however, remove the complexity of the technologies involved. The same hard choices about this or that database or message broker remain.

Such choices, and the complexity therein, aren't going away anytime soon. Companies like Cloudera and Hortonworks have arisen to try to streamline those choices, tidying them up into stacks, but they still essentially provide tools that need to be understood in order to be useful. Amazon Web Services is going a step further with its Lambda service, which allows developers to focus on writing their application code while AWS takes care of all the underlying infrastructure.

But the next step is to pre-fab the application for the end user entirely, which is what former Wall Street analyst Peter Goldmacher dubbed a much bigger opportunity than selling infrastructure components. In his words, one major category of "winners [is] the Apps and Analytics vendors that abstract the complexity of working with very complicated underlying technologies into a user friendly front end. The addressable audience of business users is exponentially larger than the market for programmers working on core technology."

This is where the market needs to get to, and fast. We're nowhere near done. For every Uber that is able to master all the underlying big data technologies to up-end industries, there are hundreds of traditional companies that simply want to reinvent themselves and need someone to make their data more actionable. We need this category of vendor to emerge. Now.

Tuesday, October 4, 2016

Big data, business analytics to hit $203 billion by 2020, says IDC report


By   | October 3, 2016, 9:41 AM PST
A spending guide by IDC forecasts that the big data market will grow at a compound annual growth rate of 11.7% through 2020, led by five distinct industries.




The big data and business analytics (BDA) market is predicted to hit $203 billion in the year 2020, up from $130.1 billion in 2016, according to research firm IDC. The findings, released Monday, come from IDC's Worldwide Semiannual Big Data and Analytics Spending Guide.
"The availability of data, a new generation of technology, and a cultural shift toward data-driven decision making continue to drive demand for big data and analytics technology and services," said Dan Vesset, group vice president of analytics and information management at IDC, in a press release.
One of the most interesting takeaways from the report was the

Saturday, October 1, 2016

6 don'ts when leading big data projects
Here's what not to do during big data projects to keep risks low and success rates high.




Image: iStock

By Mary Shacklett | September 30, 2016, 2:41 PM PST

Most organizations have a set of big data best practices they have formulated from their successful project work. An equally important list is the pitfalls that organizations should stay away from when it comes to big data and analytics. Here are six don'ts to keep in mind during your big data projects.


1: Swing for the fences

The most successful big data initiatives build a strong foundation for big data and analytics and use them. The best way to do this is by creating a constant path of new big data deliverables that incrementally and continuously improve the organization's ability to tackle strategies and