Ten big data case studies in a nutshell


What are companies such as Macy’s, Tesco, American Express and Wal-Mart Stores doing with big data? The Data Mill reports.


You haven’t seen big data in action until you’ve seen Gartner analyst Doug Laney present 55 examples of big data case studies in 55 minutes. “It’s kind of like The Complete Works of Shakespeare,” Laney joked at Gartner Symposium, “though less entertaining and hopefully more informative.” (Well, maybe, for this tech crowd.) The presentation was, without question, a master class on the three Vs definition of big data: data characterized by increasing variety, velocity and volume. It’s a description, by the way, that Laney — who also coined the term infonomics — floated way back in 2001.



The 55 examples are not intended to intimidate, but to instruct. Laney told the audience not to feel overwhelmed, but to home in on the big data case studies that might improve business performance at their own companies: “Yes, I know you’re in industry x, but there are tremendous ideas that come from other industries that you need to consider adapting and adopting for your own industry,” he said.

Here are 10 of them:

1. Macy’s Inc. and real-time pricing. The retailer adjusts pricing in near-real time for 73 million (!) items, based on demand and inventory, using technology from SAS Institute.

2. Tipp24 AG, a platform for placing bets on European lotteries, and prediction. The company uses KXEN software to analyze billions of transactions and hundreds of customer attributes, and to develop predictive models that target customers and personalize marketing messages on the fly. That led to a 90% decrease in the time it took to build predictive models. SAP is in the process of acquiring KXEN. “That’s probably a great move by SAP to fill a predictive analytics gap they’ve long had,” Laney said.

3. Wal-Mart Stores Inc. and search. The mega-retailer’s latest search engine for Walmart.com includes semantic data. Polaris, a platform that was designed in-house, relies on text analysis, machine learning and even synonym mining to produce relevant search results. Wal-Mart says adding semantic search has increased the rate at which online shoppers complete a purchase by 10% to 15%. “In Wal-Mart terms, that is billions of dollars,” Laney said.

4. Fast food and video. This company (Laney wouldn’t say which one) is training cameras on its drive-through lanes to determine what to display on its digital menu board. When the lines are longer, the menu features products that can be served up quickly; when the lines are shorter, the menu features higher-margin items that take longer to prepare.

5. Morton’s The Steakhouse and brand recognition. When a customer jokingly tweeted the Chicago-based steakhouse chain and requested that dinner be sent to the Newark airport, where he would be getting in late after a long day of work, Morton’s became a player in a social media stunt heard ’round the Interwebs. The steakhouse saw the tweet, discovered he was a frequent customer (and frequent tweeter), pulled data on what he typically ordered, figured out which flight he was on, and then sent a tuxedo-clad delivery person to serve him his dinner. Sure, the whole thing was a publicity stunt (that went viral), but that’s not the point. The question businesses should be asking themselves: “Is your company even capable of something like this?” Laney said.

6. PredPol Inc. and repurposing. The Los Angeles and Santa Cruz police departments, a team of educators and a company called PredPol have taken an algorithm used to predict earthquakes, tweaked it and started feeding it crime data. The software can predict where crimes are likely to occur down to 500 square feet. In LA, there’s been a 33% reduction in burglaries and 21% reduction in violent crimes in areas where the software is being used.


7. Tesco PLC and performance efficiency: The supermarket chain collected 70 million refrigerator-related data points coming off its units and fed them into a dedicated data warehouse. Those data points were analyzed to keep better tabs on performance, gauge when the machines might need to be serviced and do more proactive maintenance to cut down on energy costs.

8. American Express Co. and business intelligence. Hindsight reporting and trailing indicators can only take a business so far, AmEx realized. “Traditional BI [business intelligence] hindsight-oriented reporting and trailing indicators aren’t moving the needle on the business,” Laney said. So AmEx started looking for indicators that could really predict loyalty, and developed sophisticated predictive models to analyze historical transactions and 115 variables to forecast potential churn. The company believes it can now identify 24% of Australian accounts that will close within the next four months.

9. Express Scripts Holding Co. and product generation. Express Scripts, which processes pharmaceutical claims, realized that those who most need to take their medications were also those most likely to forget to take their medications. So it created a new product: beeping medicine caps and automated phone calls reminding patients it’s time to take the next dose.

10. Infinity Property Casualty Corp. and dark data. Laney defines dark data as underutilized information assets that have been collected for a single purpose and then archived. But given the right circumstances, that data can be mined for other reasons. Infinity, for example, realized it had years of adjusters’ reports that could be analyzed and correlated to instances of fraud. It built an algorithm out of that project and used the data to reap $12 million in subrogation recoveries.

Welcome to The Data Mill, a weekly column devoted to all things data. Heard something newsy (or gossipy)? Email me or find me on Twitter at @TT_Nicole.



How to get a sticky load balancer in Windows Azure


Windows Azure has a load balancer that you can use for free. It is used mostly to load balance port 80 (aka web traffic) across a group of identically configured web servers, although it can be used to load balance any TCP/UDP port. It is not a sticky load balancer.

Most load balancers in the corporate world have a setting called ‘sticky sessions.’ This feature will route a user back to the same web server over and over again. This is done with performance in mind. People think that if the same user goes to the same server, the application will run faster because their data is probably cached on that particular server.

I think this is a fallacy in most cases, and that sticky sessions in a load balancer are a bad smell hinting at some rotting architecture. When I see sticky sessions somewhere, it is usually because one of the following is true:

1. They are using session state

To me this is a big crime in and of itself. You should not be using session state, and if you are, it must be kept to a minimum. Session state turns out to be a huge crutch that helps people avoid getting their heads around how the web is truly stateless. You will see this most often with people who don’t understand the web at a low level, and who probably spent the early part of their careers architecting client/server applications. Or they are just lazy.

The second half to this first problem is that they aren’t sharing said evil session state across their servers. For crying out loud people! If you insist on having session state, please for the love of binary share that state amongst your servers.

This sharing is easy to do with the variety of providers available to you; most work ‘out of the box’. Sharing state means that ASP.NET will read/write the state to a central location, instead of in-process, on-box memory. For example, you can share it in a SQL box, or in off-box memory. You can share it in Azure storage, or even AppFabric. You can share it with a box. You can share it with a Fox. Ok, not a fox. All by changing a line of configuration in your application.
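To make the “line of configuration” point concrete, here is a minimal sketch of a web.config that moves ASP.NET session state off the box and into SQL Server; the connection string is a placeholder, and StateServer or custom providers (Azure storage, AppFabric) are wired up the same way:

```xml
<!-- Minimal sketch: shared session state via SQL Server.
     The connection string below is a placeholder. -->
<configuration>
  <system.web>
    <!-- mode="InProc" is the default; SQLServer stores state centrally -->
    <sessionState mode="SQLServer"
                  sqlConnectionString="Data Source=mySqlBox;Integrated Security=SSPI;"
                  timeout="20" />
  </system.web>
</configuration>
```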

Avoid bloated session state, and share with your friends.

If you are sharing your state amongst the web server group, then you don’t need a sticky load balancer, since no matter which server the user goes to they can get their state.

2. They are trying to shoehorn a stateful application into the web

The web isn’t stateful. Embrace it and move on. Every time a browser makes a connection to a web server, it’s starting from scratch. In a natural call (without any crutches), all the server has to go on is the user agent info (browser type, IP address, etc.) and the contents of the requested URL. That’s it.
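For illustration, a bare request carries only something like the following (all values hypothetical); nothing in it tells the server what happened on any previous call:

```http
GET /products/pizza-cutters?page=2 HTTP/1.1
Host: www.example.com
User-Agent: Mozilla/5.0 (Windows NT 6.1; rv:24.0) Gecko/20100101 Firefox/24.0
Accept: text/html
```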

Over the years, especially early on, our industry has adopted several crutches to help get us around this statelessness that bothers us so. All of them have led to horror when overused, so a light touch goes a long way. Those are, namely, session state (see above) and cookies.

Cookies are in some ways the same as session state. They save state for the application between calls. But instead of storing the state on the server, it is stored on the client. Many people wave this off: “Oh, it’s not state, just some bits of data.”

The Real Problem

The real problem with sticky sessions is that they lead to fragility in your web cluster/farm/group/collective. If user A is always going to server X, and all of their evil state and data is on that server, then you have introduced a single point of failure for that user. All of our infrastructure efforts are aimed at high availability and reliability. A sticky load balancer throws its hat in our faces and says, with an outrageous accent, “I don’t think so!”

All of a sudden, all of our hard work in building our web farm is thrown out the window because of some lazy architecture decisions. Your group of super servers is reduced to a simple group of individual servers.

When that one server does go offline (and it will, because failure happens; embrace it) you will lose all those users. They will lose their state and data. They will hit that button on their screen, and when the browser round trips and refreshes (because if you’re doing this, you likely aren’t doing SPA) the browser will return an error. That person won’t get their Star Trek pizza cutter, will have a bad experience, and their day will spiral out of control. They will go home and likely kick their neighbor’s dog. All because you used a sticky load balancer. Note: please do not kick anyone’s dog.

A secondary issue is load leveling. A great use of a load balancer that many people aren’t aware of is <sarcasm>that it can level the load</sarcasm> across several servers. In a sticky LB scenario, it is easy for server X to get really busy while server N sits lonely, idle and unused. This always leads to resentment on the part of the server doing all the work, so try to avoid this.

If you were running your load balancer in a normal way, your users and their requests would flow evenly (fairly so, anyway) across all of those web servers you paid for, giving the user a great experience. A server could go down, and the user’s work would be picked up by any of the other servers on the next trip. Boom goes the dynamite.

Do you really hate state?

I didn’t mean for this post to turn into a hate piece on state (session or cookie). I will soon get to why I wrote this post, but I wanted to get off my chest how I have seen state abused before. Like I mentioned above, I don’t think you are evil for using it. An experienced and critically thinking developer who knows when to break the rules is allowed to break the rules.

So, what does this have to do with Windows Azure?

Good question, my good man! In Windows Azure, the load balancer we started talking about is a non-sticky load balancer. Yes, if you start up the local simulator and try some tests, you won’t see that happening. If you start up a group of web roles in Cloud Services and try hitting F5 really fast in IE, you won’t see balancing either.

That’s because there is some intelligence in the balancer, and because you can’t cause enough traffic with your own keyboard. Nor my super awesome-backlit-mechanical-Cherry-MX-red-switch-keyboard-that-the-neighbors-can-hear-when-I-am-typing-keyboard.

But some people NEED a sticky balancer. They just NEED it. Either they are migrating something that was built that way and they just can’t make the investment to fix it yet, or… actually, that’s the only reason I can come up with.

So, here is your escape hatch: IIS Application Request Routing (ARR). ARR adds a layer of load balancing in software at the server level, instead of lower down the network stack. This lets it use some intelligence about what the software is doing.

IIS ARR can be used on-premises or in Windows Azure. Basically, you front-end your application with servers running ARR. Officially, ARR is:

“IIS Application Request Routing (ARR) 2.5 enables Web server administrators, hosting providers, and Content Delivery Networks (CDNs) to increase Web application scalability and reliability through rule-based routing, client and host name affinity, load balancing of HTTP server requests, and distributed disk caching. With ARR, administrators can optimize resource utilization for application servers to reduce management costs for Web server farms and shared hosting environments.”
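As a rough sketch only (the farm name and server addresses are placeholders, and the exact schema can vary by ARR version), a sticky server farm is declared in IIS’s applicationHost.config along these lines:

```xml
<!-- Hypothetical ARR server farm with cookie-based affinity.
     Names and addresses are placeholders. -->
<webFarms>
  <webFarm name="myFarm" enabled="true">
    <server address="10.0.0.4" enabled="true" />
    <server address="10.0.0.5" enabled="true" />
    <applicationRequestRouting>
      <!-- useCookie="true" is what makes the balancing sticky -->
      <affinity useCookie="true" cookieName="ARRAffinity" />
      <loadBalancing algorithm="WeightedRoundRobin" />
    </applicationRequestRouting>
  </webFarm>
</webFarms>
```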

So, if you need sticky load balancing, or maybe some smarter session routing in your application, either on-premises, or in the cloud, check out ARR. You can read all about it at the IIS website.

ARR also includes some great caching features. The ARR cache would be a good alternative to AppFabric Cache, if you don’t want to run that.


Big Data Training


Big Data Introductory Training

Introduction to Big Data

  • Foundation
  • Course 1250
  • 3 Days
  • Live, Online In-class

Big Data Insights, Technologies & Trends

  • Foundation
  • Course 4500
  • 1 Day
  • Live, Online

Big Data Computation Training

Hadoop Programming with Java for Big Data Solutions

  • Intermediate
  • Course 1251
  • 4 Days
  • Live, Online In-class

Apache Spark Programming with Scala for Big Data Solutions

  • Intermediate
  • Course 1262
  • 4 Days
  • Live, Online In-class

Hadoop Programming Essentials

  • Intermediate
  • Course 4504
  • 1 Day
  • Private Team Training

Big Data Administration Training

Microsoft Certification Perform Data Engineering on Microsoft HD Insight (20775) NEW!

  • Intermediate
  • Course 8491
  • 5 Days
  • Live, Online In-class

Hadoop Architecture & Administration for Big Data Solutions

  • Intermediate
  • Course 1252
  • 4 Days
  • Live, Online In-class

Planning and Setting Up Your First Hadoop Cluster

  • Intermediate
  • Course 4502
  • 1 Day
  • Private Team Training

Advanced Hadoop Configuration for High-Availability Clusters

  • Intermediate
  • Course 4503
  • 1 Day
  • Private Team Training

NoSQL Training

Building Apache Cassandra Databases

  • Intermediate
  • Course 1260
  • 3 Days
  • Live, Online In-class

Building Enterprise Solutions with MongoDB

  • Intermediate
  • Course 1261
  • 4 Days
  • Live, Online In-class

Data Analytics Courses

Microsoft Certification Perform Cloud Data Science with Azure Machine Learning (20774) NEW!

  • Intermediate
  • Course 8490
  • 5 Days
  • Live, Online In-class

Microsoft Certification Analyzing Big Data with Microsoft R (20773) NEW!

  • Intermediate
  • Course 8489
  • 3 Days
  • Live, Online In-class

Operationalize Cloud Analytics with Microsoft Azure (55224-2) NEW!

  • Intermediate
  • Course 8488
  • 2 Days
  • Live, Online In-class

Microsoft Azure Big Data Analytics Solutions (55224-1) NEW!

  • Intermediate
  • Course 8487
  • 2 Days
  • Live, Online In-class

Introduction to Data Science for Big Data Analytics

  • Foundation
  • Course 1253
  • 5 Days
  • Live, Online In-class

Extracting Business Value From Big Data With Pig and Hive

  • Intermediate
  • Course 1254
  • 4 Days
  • Live, Online In-class

Introduction to R for Data Analytics

  • Foundation
  • Course 5045
  • 1 Day
  • Live, Online

Introduction to Python for Data Analytics

  • Foundation
  • Course 4509
  • 1 Day
  • Live, Online

Big Data Visualizations Training

Tableau: Presenting Analytic Visualizations

  • Foundation
  • Course 1256
  • 3 Days
  • Live, Online In-class

Activity-Based Intelligence Training

Executive Overview of Activity-Based Intelligence (ABI)

  • Advanced
  • Course 8050
  • 1 Day
  • Private Team Training

Essentials of Activity-Based Intelligence (ABI)

  • Foundation
  • Course 8101
  • 1 Day
  • Private Team Training

Developing Activity-Based Intelligence (ABI) Applications

  • Intermediate
  • Course 8201
  • 5 Days
  • Private Team Training

Applying Activity-Based Intelligence (ABI) Methods

  • Intermediate
  • Course 8205
  • 5 Days
  • Private Team Training


8 Big Data Solutions for Small Businesses


Big Data isn’t just for big businesses with big budgets. Today, small businesses, too, can reap the benefits of the massive amounts of online and offline information available, and make wise, data-driven decisions to grow. Most Big Data discussions concern enterprises that have the resources to hire data scientists and research firms. But if you know where to look, there are several ways that your small business can gather, analyze and make sense of data you already have, without breaking the bank. Here are eight Big Data solutions for small businesses.

1. ClearStory Data

Analyzing complex business intelligence doesn’t have to be rocket science. ClearStory Data offers advanced data mining and analytics tools that also present information in a simple, easy-to-understand way.

ClearStory Data works by combining your business’s internal data with publicly available information to help you make better business decisions. These insights are displayed using the StoryBoard feature, which lets you create graphs, story lines and interactive visuals right from the ClearStory dashboard. It also comes with collaboration features that enable team discussion, for instance, by commenting on individual StoryBoards, much like you would on social media. In addition to business data, ClearStory can also provide department-specific data, including marketing, sales, operations and customer analytics. The platform also covers a wide range of industries, such as retail, food and beverage, media and entertainment, financial services, manufacturing, consumer packaged goods, healthcare, pharmaceutical and more. Contact ClearStory Data for pricing information.

2. Kissmetrics

Looking to increase your marketing ROI? Kissmetrics, a popular customer intelligence and web analytics platform, could be your best friend. The platform aims to help businesses optimize their digital marketing by identifying their best customers and increasing conversions.

Unlike traditional web analytics tools, Kissmetrics goes beyond tracking basic metrics like pageviews, referrals and demographic information. Kissmetrics specifically tracks individual visitors, particularly for insights that can be used for better segmentation and more successful marketing campaigns. Kissmetrics also offers engagement tools to help increase sales, such as the ability to create triggers and design styles that make the most of customer behaviors. All of this means more conversions, less churn (customers who quickly leave your site) and, ultimately, higher ROI. In addition, Kissmetrics offers educational resources to help businesses improve marketing campaigns, such as marketing webinars, how-to guides, articles and infographics. Kissmetrics plans start at $120 per month. [See related story: Best CRM Software for Small Business]

3. InsightSquared

The tools you already use provide another rich source of data. This doesn’t mean you have to waste time mining your own data and arduously analyzing it using one spreadsheet after another. Instead, InsightSquared connects to popular business solutions you probably already use, such as Salesforce, QuickBooks, ShoreTel Sky, Google Analytics, Zendesk and more, to automatically gather data and extract actionable information.

For instance, using data from customer relationship management (CRM) software, InsightSquared can provide a wealth of small business sales intelligence, such as pipeline forecasting, lead generation and tracking, profitability analysis, and activity monitoring. It can also help businesses discover trends, strengths and weaknesses, sales team wins and losses, and more. In addition to sales tools, InsightSquared’s suite of products also includes marketing, financial, staffing and support analytics tools. InsightSquared starts at $65 per user per month.


4. Google Analytics

You don’t need fancy, expensive software to begin gathering data. You can start with an asset you already have: your website. Google Analytics, Google’s free web-traffic-monitoring tool, provides all types of data about website visitors, using a multitude of metrics and traffic sources.

With Google Analytics, you can extract long-term data to reveal trends and other valuable information, so you can make wise, data-driven decisions. For instance, by tracking and analyzing visitor behavior, such as where traffic is coming from, how audiences engage and how long visitors stay on a website (along with bounce rates), you can make better decisions when striving to meet your website’s or online store’s goals. Another example is analyzing social media traffic, which will allow you to make changes to your social media marketing campaigns based on what is and isn’t working. Studying mobile visitors can also help you extract information about customers browsing your site on their mobile devices, so you can provide a better mobile experience. Here’s how to sign up for Google Analytics.

5. IBM’s Watson Analytics

While many Big Data solutions are built for highly knowledgeable data scientists and analysts, IBM’s Watson Analytics makes advanced and predictive business analytics easily accessible to small businesses. The platform doesn’t require experience with complex data mining and analysis systems; it automates the process instead. This self-service analytics solution includes a suite of data access, data refinement and data warehousing services, giving you all the tools you need to prepare and present data yourself in a simple and actionable way to guide decision-making.

Unlike other analytics solutions that focus on one area of business, Watson Analytics unifies all your data analysis projects into a single platform; it can be used for all types of data analysis, from marketing to sales, finance, human resources and other parts of your operations. Its natural language technology helps businesses identify problems, recognize patterns and gain meaningful insights to answer key questions, like what ultimately drives sales, which deals are likely to close, how to make employees happy and more.

6. Canopy Labs

Big Data won’t just help you make better business decisions; it can help you predict the future, too. Canopy Labs, a customer analytics platform, uses customer behavior, sales trends and predictive behavioral models to extract valuable information for future marketing campaigns and to help you discover the most opportune product recommendations.

One of Canopy Labs’ standout features is the 360-degree Customer View, which shows comprehensive data about each individual customer. Its purpose is twofold: First, it reveals each customer’s standing, such as lifetime value, loyalty and engagement level, as well as purchase histories, email behaviors and other metrics, which shows which customers are profitable and worth reaching out to. Second, with this information, businesses can better create personalized offers, track customer responses and launch improved outreach campaigns. Canopy Labs handles the complex, technical side of Big Data, so all you have to focus on are your customers. The service is free for up to 5,000 customers. Paid plans for additional customers start at $250 a month.

7. Tranzlogic

It’s no secret that credit card transactions are chock-full of invaluable data. Although access was once limited to companies with significant resources, customer intelligence company Tranzlogic makes this information available to small businesses without the big-business budget.

Tranzlogic works with merchants and payment systems to extract and analyze proprietary data from credit card purchases. This information can then be used to measure sales performance, evaluate customers and customer segments, improve promotions and loyalty programs, launch more effective marketing campaigns, write better business plans, and perform other tasks that lead to smarter business decisions. Moreover, Tranzlogic requires no tech smarts to get started; it is a turnkey program, meaning there is no installation or programming required. Simply log in to access your merchant portal. Contact Tranzlogic for pricing information.

8. Qualtrics

If you don’t currently have any rich sources of data, conducting research may be the answer. Qualtrics lets businesses conduct a wide range of studies and surveys to gain quality insights that guide data-driven decision making.

Qualtrics offers three types of real-time insights: customer, market and employee insights. To gain customer insight, use Qualtrics survey software for customer satisfaction, customer experience and website feedback surveys. To study the market, Qualtrics also offers advertising testing, concept testing and market research programs. And when it comes to your team, Qualtrics can help conduct employee surveys, exit interviews and reviews. Other options include online samples, academic research and mobile surveys. Contact Qualtrics to request pricing.



BIGDACI 2017


NEW Keynote Speaker (confirmed):
Professor Nir Shavit, MIT (Massachusetts Institute of Technology), USA

The conference is expected to provide an opportunity for the researchers to meet and discuss the latest solutions, scientific results and methods in solving intriguing problems in the fields of Big Data Analytics, Intelligent Agents and Computational Intelligence.

The conference proceedings will be submitted to the following indexing services (among others):

NEW Selected authors of best papers will be invited to submit extended versions of their papers to selected journals.

NEW The best paper authors will be invited to publish extended versions of their papers in the IADIS Journal on Computer Science and Information Systems (ISSN: 1646-3692), which will be indexed by the Emerging Sources Citation Index by Thomson Reuters.

Conference Official Language: English.

This is a blind peer-reviewed conference.

Conference Poster in PDF



Data-driven innovation for growth and well-being


Data-driven innovation forms a key pillar in 21st century sources of growth. The confluence of several trends, including the increasing migration of socio-economic activities to the Internet and the decline in the cost of data collection, storage and processing, is leading to the generation and use of huge volumes of data, commonly referred to as “big data”. These large data sets are becoming a core asset in the economy, fostering new industries, processes and products and creating significant competitive advantages. For instance:

  • In business, data exploitation promises to create value in a variety of operations, from the optimisation of value chains in global manufacturing and services to more efficient use of labour and tailored customer relationships.
  • The adoption of smart-grid technologies is generating large volumes of data on energy and resource consumption patterns that can be exploited to improve energy and resource efficiency.
  • The public sector is an important data user, but also a key source of data. Greater access to and more effective use of public-sector information (PSI), as called for by the 2008 OECD Council Recommendation on PSI, can generate benefits across the economy.
  • Greater access to and use of data create a wide array of policy issues, such as privacy and consumer protection, open data access, skills and employment, and measurement, to name a few.

Since 2011, the OECD has been undertaking extensive analysis on the role of data in promoting innovation, growth and well-being within its multi-disciplinary project on New Sources of Growth: Knowledge-Based Capital (KBC). Objectives include:

  • Improving the evidence base on the role of data for promoting growth and well-being, and
  • Providing policy guidance on how to maximise the benefits of the data-driven economy, while mitigating the associated risks.

The project encompasses several building blocks:

  • The new data-driven era of scientific discovery
  • The role of data for enhancing health outcomes
  • Harnessing data for better governance
  • Cloud computing, analytics and other key enablers
  • Skills and other implications for employment
  • Ensuring trust in the data-driven economy
  • Measuring investments in data as knowledge-based capital

Upcoming and recent work

This new OECD project analyses how enhanced access to data can maximise the social and economic value of data, and at the same time address legitimate concerns of individuals and organisations.

Four approaches for enhancing access to data will be covered in more detail: open data, community-based data sharing agreements, data portability, and data markets.

The project will assess the social and economic costs and benefits of each approach, as well as policy challenges that need to be addressed to foster the coherence of data governance frameworks applied across application areas and sectors.

The OECD will hold an expert workshop on 2-3 October 2017 to kick-start the project. It aims to move the policy agenda forward by filling existing knowledge gaps and, in particular, identifying best practices in the public and private sector.

This Recommendation of the OECD Council calls upon countries to develop and implement health data governance frameworks that secure privacy while enabling health data uses that are in the public interest. It is structured according to twelve high-level principles, ranging from engagement of a wide range of stakeholders, to effective consent and choice mechanisms for the collection and use of personal health data, to monitoring and evaluation mechanisms. These principles set the conditions to encourage greater cross-country harmonisation of data governance frameworks so that more countries are able to use health data for research, statistics and health care quality improvement, as well as for international comparisons.

See also the report: Health Data Governance – Privacy, Monitoring and Research

Participants at the November 2016 Forum agreed that advanced artificial intelligence is already here and that there are few limits to what it will be able to do. There was a call to focus on ‘applied AI’ that is designed to accomplish a specific problem-solving or reasoning task. However, several participants felt that policy-makers could not ignore the possibility of a (hypothetical) “artificial general intelligence” (AGI) whereby machines would become capable of general intelligent action, like a human being.

Today, the generation and use of huge volumes of data are redefining our intelligence capacity and our social and economic landscapes, spurring new industries, processes and products, and creating significant competitive advantages. In this sense, data-driven innovation (DDI) has become a key pillar of 21st-century growth, with the potential to significantly enhance productivity, resource efficiency, economic competitiveness, and social well-being.

Greater access to and use of data create a wide array of impacts and policy challenges, ranging from privacy and consumer protection to open access issues and measurement concerns, across public and private health, legal and science domains. The report Data-driven Innovation: Big Data for Growth and Well-being aims to improve the evidence base on the role of DDI for promoting growth and well-being, and provide policy guidance on how to maximise the benefits of DDI and mitigate the associated economic and societal risks.

Dementia Research and Care: Can Big Data Help?

OECD countries are developing strategies to improve the quality of life of those affected by dementia and to support long-term efforts for a disease-modifying therapy or cure.

This report follows a September 2014 workshop that aimed to advance international discussion of the opportunities and challenges, as well as successful strategies, for sharing and linking the massive amounts of population-based health and health care data that are routinely collected (broad data) with detailed clinical and biological data (deep data) to create an international resource for research, planning, policy development, and performance improvement. The workshop sought to provide new insights into the opportunities and challenges in making broad and deep data a reality, from funding to data standards, to data sharing, to new analytics, to protecting privacy, and to engaging with stakeholders and the public.


How can I upload files asynchronously? (Stack Overflow)


    2017 Update: It still depends on the browsers your demographic uses.

An important thing to understand with the “new” HTML5 File API is that it wasn’t supported until IE 10. If the specific market you’re aiming at has a higher-than-average propensity toward older versions of Windows, you might not have access to it.

Going into 2017, about 5% of browsers are one of IE 6, 7, 8 or 9. If you head into a big corporation (e.g. this is a B2B tool, or something you’re delivering for training) that number can rocket. Just a few months ago —in 2016— I dealt with a company using IE8 on over 60% of their machines.

    So before you do anything: check what browser your users use. If you don’t, you’ll learn a quick and painful lesson in why “works for me” isn’t good enough in a deliverable to a client.

    My answer from 2008 follows.

    However, there are viable non-JS methods of file uploads. You can create an iframe on the page (that you hide with CSS) and then target your form to post to that iframe. The main page doesn’t need to move.

    It’s a “real” post so it’s not wholly interactive. If you need status you need something server-side to process that. This varies massively depending on your server. ASP.NET has nicer mechanisms. PHP plain fails, but you can use Perl or Apache modifications to get around it.

    If you need multiple file-uploads, it’s best to do each file one at a time (to overcome maximum file upload limits). Post the first form to the iframe, monitor its progress using the above and when it has finished, post the second form to the iframe, and so on.

    Or use a Java/Flash solution. They’re a lot more flexible in what they can do with their posts.

this is quite an old answer, but it was a bit misleading. IE supported XHR natively as far back as IE7, and supported it through ActiveX as far back as IE5. w3schools.com/ajax/ajax_xmlhttprequest_create.asp. The practical way of doing this was certainly targeting Flash (Shockwave) components, or rolling out a Flash/ActiveX (Silverlight) control. If you can originate a request and handle the response via JavaScript, it’s ajax. Though, having said that, ajax is synonymous with XHR, but it doesn’t itself describe the underlying mechanism/components that deliver/exchange the payload. – Brett Caswell Oct 29 ’15 at 14:54

This AJAX file upload jQuery plugin uploads the file somewhere, and passes the response to a callback, nothing else.

  • It does not depend on specific HTML; just give it an <input type="file">
    • It does not require your server to respond in any particular way
    • It does not matter how many files you use, or where they are on the page


    Asynchronous File Upload

    With HTML5

    You can upload files with jQuery using the $.ajax() method if FormData and the File API are supported (both HTML5 features).

    You can also send files without FormData but either way the File API must be present to process files in such a way that they can be sent with XMLHttpRequest (Ajax).
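A minimal sketch of that HTML5 path (the selector names and the /upload endpoint are placeholder assumptions, not part of the original answer):

```javascript
// Sketch: async upload via FormData + jQuery.ajax (HTML5 browsers only).
// "#uploadForm", "#fileInput" and "/upload" are placeholders.
$('#uploadForm').on('submit', function (e) {
  e.preventDefault(); // stop the normal full-page post

  var formData = new FormData();
  formData.append('file', $('#fileInput')[0].files[0]);

  $.ajax({
    url: '/upload',
    type: 'POST',
    data: formData,
    processData: false, // don't let jQuery serialize the FormData
    contentType: false, // let the browser set the multipart boundary
    success: function (response) {
      console.log('Upload finished:', response);
    },
    error: function () {
      console.log('Upload failed');
    }
  });
});
```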

    Fallback

When HTML5 isn’t supported (no File API), the only other pure JavaScript solution (no Flash or any other browser plugin) is the hidden iframe technique, which makes it possible to emulate an asynchronous request without using the XMLHttpRequest object.

It consists of setting an iframe as the target of the form with the file inputs. When the user submits the form, a request is made and the files are uploaded, but the response is displayed inside the iframe instead of re-rendering the main page. Hiding the iframe makes the whole process transparent to the user and emulates an asynchronous request.

If done properly it should work on virtually any browser, but it has some caveats, such as how to obtain the response from the iframe.
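A bare-bones sketch of the technique (the names are placeholders): the form posts into a hidden iframe, so the main page never reloads.

```html
<!-- Sketch: hidden-iframe fallback. "uploadTarget" and "/upload"
     are placeholder names. -->
<form action="/upload" method="post"
      enctype="multipart/form-data" target="uploadTarget">
  <input type="file" name="file">
  <input type="submit" value="Upload">
</form>

<!-- The server's response renders in here instead of the main page -->
<iframe name="uploadTarget" style="display: none;"></iframe>
```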

In this case you may prefer to use a wrapper plugin like Bifröst, which uses the iframe technique but also provides a jQuery Ajax transport, allowing you to send files with just the $.ajax() method.

    Plugins

Bifröst is just a small wrapper that adds fallback support to jQuery’s ajax method, but many of the aforementioned plugins, like the jQuery Form Plugin or jQuery File Upload, include the whole stack from HTML5 to different fallbacks, plus some useful features to ease the process. Depending on your needs and requirements, you might want to consider a bare implementation or one of these plugins.

Explanation

The simplest and most robust way I have done this in the past is to simply target a hidden iframe tag with your form; then it will submit within the iframe without reloading the page.

That is, if you don’t want to use a plugin, JavaScript or any other form of “magic” other than HTML. Of course you can combine this with JavaScript or what have you.

You can also read the contents of the iframe (onLoad()) for a server error or success response, and then output that to the user.
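For example, a hedged sketch of that (assuming the same-origin iframe named “uploadTarget” from the fallback sketch above, and a server that returns plain text):

```javascript
// Sketch: read the hidden iframe's body once the upload response loads.
// Only works same-origin; the name and the "error" convention are assumptions.
$('iframe[name="uploadTarget"]').on('load', function () {
  var response = $(this).contents().find('body').text();
  if (response.indexOf('error') !== -1) {
    console.log('Upload failed: ' + response);
  } else {
    console.log('Upload succeeded: ' + response);
  }
});
```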

A wee update: if relying on the onload event of an iframe during a download, don’t count on it working in Chrome. For this, the only solution I found was to use a cookie; ugly, but it worked.

    answered Jun 26 ’14 at 4:43

    I’ve written this up in a Rails environment. It’s only about five lines of JavaScript, if you use the lightweight jQuery-form plugin.

The challenge is in getting the AJAX upload working, as the standard remote_form_for doesn’t understand multipart form submission. It’s not going to send the file data Rails seeks with the AJAX request.

    That’s where the jQuery-form plugin comes into play.

    Here’s the Rails code for it:

    Here’s the associated JavaScript:

    And here’s the Rails controller action, pretty vanilla:

    I’ve been using this for the past few weeks with Bloggity, and it’s worked like a champ.

    answered Aug 13 ’09 at 22:44

    Simple Ajax Uploader is another option:

  • Cross-browser — works in IE7+, Firefox, Chrome, Safari, Opera
  • Supports multiple, concurrent uploads — even in non-HTML5 browsers
  • No Flash or external CSS — just one 5 KB JavaScript file
  • Optional, built-in support for fully cross-browser progress bars (using PHP’s APC extension)
  • Flexible and highly customizable — use any element as the upload button, style your own progress indicators
  • No forms required, just provide an element that will serve as the upload button
  • MIT license — free to use in commercial projects

    answered Jun 26 ’13 at 1:12

This seems to be the most promising so far. You had me at IE7+. Trying it out now. Thanks! – Pierre Jun 28 ’14 at 10:53

I’ve tried many scripts/plugins and this one is the best. Easy to implement and style, plus all the features. Great. – Strabek Jun 26 ’15 at 12:55

You can see a working solution with a demo here that allows you to preview and submit form files to the server. For your case, you need to use Ajax to facilitate the file upload to the server.

The data being submitted is a FormData object. On the jQuery side, use a form submit handler instead of a button click handler to submit the form file.

    answered Jul 19 ’16 at 5:18

If you use jQuery, you can upload files easily. This is a small but powerful jQuery plugin: http://jquery.malsup.com/form/.

    Example
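The plugin’s code sample didn’t survive here, but a minimal sketch of its ajaxForm() usage might look like this (the form selector is a placeholder):

```javascript
// Sketch: jQuery Form Plugin (http://jquery.malsup.com/form/).
// ajaxForm() hijacks the form's submit and posts it via Ajax,
// handling file inputs with an iframe or XHR2 under the hood.
$(function () {
  $('#uploadForm').ajaxForm({
    beforeSubmit: function () {
      console.log('Uploading...'); // runs just before the post
    },
    success: function (response) {
      console.log('Done:', response); // server response on completion
    }
  });
});
```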

I hope it will be helpful.

    answered Oct 14 ’16 at 7:17

    Sample from the link

    answered Jul 8 ’15 at 17:59

This is my solution for MVC using Ajax.

    answered Nov 3 ’15 at 20:47

You seem to have mixed some kind of framework into your answer. You should, at the very least, mention which framework your answer is usable for. Better yet, remove all the framework stuff and present only an answer to the question posed. – Zero3 Dec 19 ’15 at 18:06

You’re right; I used MVC5, JavaScript, HTML, Bootstrap. – Erick Langford Xenes Dec 21 ’15 at 15:49

so there’s actually an MVC framework called “mvc”? and it uses C#-ish syntax? that’s cruel. – nonchip Jan 6 ’16 at 22:27

also, your framework code actually does nothing valuable, why is it even there? – nonchip Jan 6 ’16 at 22:28




Best Web Hosting for Big Business – Just for Websites


There are almost as many hosting providers available as there are web pages, so finding a good host for a larger website can be hard amid all of the clutter.

    There are many factors that you should think about when choosing a host for your big business.

    • Number of Visitors
    • Support
    • Website Builders

These are just a few factors. Let’s dive into hosting, the different providers and the features they can offer your big business.

    TYPES OF HOSTING

First we need to understand the different types of hosting and how they vary from each other before we can determine the best web hosting for big business. Different types of hosting trade price against the quality of hosting. There are five different types of hosting:

    • Shared Hosting
    • Virtual Private Server
    • Dedicated Hosting
    • Managed Hosting
    • Cloud Hosting

These different types of hosting offer different prices for the type of hosting that your website needs. Let’s explore what each type of hosting is and what it means for your business.

  • Shared Hosting is for smaller businesses or individuals that need an inexpensive type of hosting. This type of hosting offers a low price in exchange for a less dedicated experience.
  • Virtual Private Servers are the next step up and, unlike shared hosting, share the server with only a few other websites. This offers better speed and response when your website gets increased visitors.
  • Dedicated Hosting moves more into the hosting-for-big-business world. Dedicated hosting gives you your own server to work with. This is a great option for a big business that plans on hiring a team to manage all aspects of its website.
  • Managed Hosting is hosting that is done for you. Basically, you pay for your team through your hosting company. This lets you take it easy while your host does all of the hard work.
  • Cloud Hosting is a newer hosting solution. Cloud hosting puts your website into the cloud, which offers different advantages and disadvantages. I am not too familiar with cloud hosting, but I did want you to know that it is out there and that more and more big businesses are making the move.

    BEST WEB HOSTING FOR BIG BUSINESS

For the sake of big business, however, there are only three of these hosting options you should focus on.

For the sake of argument, I am going to assume that you are a big business and your website already receives, or is going to receive, a lot of visitors. This is why big business needs to focus on these three types of hosting.

Lots of traffic means a heavy server load. This means that you probably need at least a dedicated server to handle all of your traffic and ensure that your website keeps going during high demand. This is also why more and more businesses are moving to cloud hosting: they receive the benefits of massive storage and speed when there is a need. So which type of hosting is the best choice for big business?

    Dedicated Hosting

Dedicated hosting is a great choice if you have your own team and only need a server. It gives your team the tools to host, build and maintain your website. There is more work in it for your business, but you get the benefit of knowing what is happening with your website. A great hosting company with dedicated servers is VeeroTech. VeeroTech offers stellar one-on-one support and lightning-fast dedicated servers.

    Cloud Hosting

Cloud hosting is a great way to maintain savings while still offering the best speed and features for your business. The big selling points of cloud hosting are that you only pay for what you use, great security, flexibility and availability.

    Managed Hosting

Managed hosting is a great choice for medium to big businesses and gives you a host that works for you. If you don’t have that team of techies hired, then managed hosting is the choice for you. With managed hosting you will receive great bang for your buck.

    FIND A HOST

There are so many hosts available on the internet that I don’t think I could even name them all this week. To cut through all the riff-raff, you need to find a hosting company that offers the right features.

If you find a hosting company that does not offer these features, then just stay away. A good hosting company should offer to do things for you, and should have compiled all of the best resources for you within your hosting account.

We have compiled a great place to find website hosting for big or small business. There are three main companies that we want you to focus on for your hosting: WordPress Engine, VeeroTech and Blue Host. These hosting providers offer the best of the features described above. To find them, simply head to our website hosting page.




Best Online Interactive Training on QA Testing, Selenium, LoadRunner, ETL Testing, SQL DBA, MSBI

Online Training

In this modern day, the need for information & knowledge transcends physical boundaries & cultural barriers. Mind Q Systems offers training in a format that allows trainees to experience the familiar interaction of classroom training without being in the same room: virtual classrooms where trainer & trainees assemble & interact in pursuit of knowledge & information.

At an agreed time & date, the trainer & the trainees connect to a web-based meeting center (WebEx). Through this meeting center, the trainer can share his presentations, present demos of the tools, and share his desktop, allowing the participants to get a feel for the tools while he is simultaneously engaged in explaining the concepts. The trainer’s voice & video can be clearly heard and seen by the trainees. Trainees may raise their doubts at any point of time, and the trainer may respond by suitably clarifying them.

This technology has been tested and proven over a period of time & finds acceptance with thousands of organizations across the world, with millions of people benefitting from it.

    Hardware / Software Requirements

Any PC / laptop with a sound card, with headphone and microphone attached & connected to the Internet (at least 64 Kbps).

    For Upcoming Batches Click here


Data Lake vs Data Warehouse: Key Differences


We hear a lot about data lakes these days, and many are arguing that a data lake is the same as a data warehouse. But in reality, they are optimized for different purposes, and the goal is to use each one for what it was designed to do.

By Tamara Dull (SAS).

    This is the 4th post in a 5-part series, “A Big Data Cheat Sheet: What Marketers Want to Know.” This spin-off series for marketers was inspired by a popular big data presentation I delivered to executives and senior management at the SAS Global Forum Executive Conference earlier this year.

    In this 5-part big data series, we’re taking a look at these five questions from the perspective of a marketer:

    • What can Hadoop do that my data warehouse can’t?
    • Why do we need Hadoop if we’re not doing big data?
    • Is Hadoop enterprise-ready?
    • Isn’t a data lake just the data warehouse revisited?
    • What are some of the pros and cons of a data lake?

We’ve already tackled the first three questions (here, here, and here), and we’re now on question 4. It’s time to talk about the data lake.

    Question 4: Isn’t a data lake just the data warehouse revisited?

    Some of us have been hearing more about the data lake, especially during the last six months. There are those that tell us the data lake is just a reincarnation of the data warehouse—in the spirit of “been there, done that.” Others have focused on how much better this “shiny, new” data lake is, while others are standing on the shoreline screaming, “Don’t go in! It’s not a lake—it’s a swamp!”

All kidding aside, the commonality I see between the two is that they are both data storage repositories. That’s it. But I’m getting ahead of myself. Let’s first define data lake to make sure we’re all on the same page. James Dixon, the founder and CTO of Pentaho, has been credited with coining the term. This is how he describes a data lake:

    “If you think of a datamart as a store of bottled water – cleansed and packaged and structured for easy consumption – the data lake is a large body of water in a more natural state. The contents of the data lake stream in from a source to fill the lake, and various users of the lake can come to examine, dive in, or take samples.”

And earlier this year, my colleague Anne Buff and I participated in an online debate about the data lake. My rally cry was #GOdatalakeGO, while Anne insisted on #NOdatalakeNO. Here’s the definition we used during our debate:

    “A data lake is a storage repository that holds a vast amount of raw data in its native format, including structured, semi-structured, and unstructured data. The data structure and requirements are not defined until the data is needed.”

The comparisons below help flesh out this definition. They also highlight a few of the key differences between a data warehouse and a data lake. This is, by no means, an exhaustive list, but it does get us past this “been there, done that” mentality:

    Let’s briefly take a look at each one:

  • Data. A data warehouse only stores data that has been modeled/structured, while a data lake is no respecter of data. It stores it all—structured, semi-structured, and unstructured. [See my “big data is not new” graphic. The data warehouse can only store the orange data, while the data lake can store all the orange and blue data.]
  • Processing. Before we can load data into a data warehouse, we first need to give it some shape and structure—i.e. we need to model it. That’s called schema-on-write. With a data lake, you just load in the raw data, as-is, and then when you’re ready to use the data, that’s when you give it shape and structure. That’s called schema-on-read. Two very different approaches (see the sketch after this list).
    • Storage. One of the primary features of big data technologies like Hadoop is that the cost of storing data is relatively low as compared to the data warehouse. There are two key reasons for this: First, Hadoop is open source software, so the licensing and community support is free. And second, Hadoop is designed to be installed on low-cost commodity hardware.
    • Agility. A data warehouse is a highly-structured repository, by definition. It’s not technically hard to change the structure, but it can be very time-consuming given all the business processes that are tied to it. A data lake, on the other hand, lacks the structure of a data warehouse—which gives developers and data scientists the ability to easily configure and reconfigure their models, queries, and apps on-the-fly.
    • Security. Data warehouse technologies have been around for decades, while big data technologies (the underpinnings of a data lake) are relatively new. Thus, the ability to secure data in a data warehouse is much more mature than securing data in a data lake. It should be noted, however, that there’s a significant effort being placed on security right now in the big data industry. It’s not a question of if, but when.
  • Users. For a long time, the rally cry has been “BI and analytics for everyone!” We’ve built the data warehouse and invited “everyone” to come, but have they come? On average, 20-25% of them have. Is it the same cry for the data lake? Will we build the data lake and invite everyone to come? Not if you’re smart. Trust me: a data lake, at this point in its maturity, is best suited for the data scientists.
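To make the schema-on-write vs. schema-on-read distinction concrete, here is an illustrative sketch only (plain JavaScript with hypothetical record shapes; real systems would use warehouse DDL and lake query engines instead):

```javascript
// Schema-on-write (warehouse style): shape is enforced before loading.
function loadIntoWarehouse(record) {
  // reject anything that doesn't match the predefined model
  if (typeof record.customerId !== 'string' || !(record.ts instanceof Date)) {
    throw new Error('Record does not fit the schema');
  }
  return { customerId: record.customerId, ts: record.ts };
}

// Schema-on-read (lake style): store raw strings, apply shape at query time.
function queryLake(rawRecords) {
  return rawRecords
    .map(function (r) { try { return JSON.parse(r); } catch (e) { return null; } })
    .filter(Boolean) // drop records that didn't parse
    .map(function (r) { return { customerId: r.customerId, ts: new Date(r.ts) }; });
}
```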

    Why this matters

    As a marketer, you may hear rumblings that your organization is setting up a data lake and/or your marketing data warehouse is a candidate to be migrated to this data lake. It’s important to recognize that while both the data warehouse and data lake are storage repositories, the data lake is not Data Warehouse 2.0 nor is it a replacement for the data warehouse.

So to answer the question — Isn’t a data lake just the data warehouse revisited? — my take is no. A data lake is not a data warehouse. They are both optimized for different purposes, and the goal is to use each one for what they were designed to do. Or in other words, use the best tool for the job.

    This is not a new lesson. We’ve learned this one before. Now let’s do it.

Bio: Tamara Dull is a Director of Emerging Technologies at SAS Best Practices, a thought leadership organization at SAS Institute. While hot topics like 3D printing and self-driving cars keep her giddy, her current focus is on big data, privacy, and the Internet of Things – the hype, the reality and the journey.

