Wikibon is a professional community that solves technology and business problems through the open-source sharing of free advisory knowledge. The hype surrounding Big Data, which showed no signs of abating in 2012, now has big dollars backing it up.
The total Big Data market reached $11.59 billion in 2012, ahead of Wikibon’s 2011 forecast. Growth was driven by increased investment in Big Data infrastructure by massive Web properties – most notably Google, Facebook, and Amazon – and by government agencies for intelligence and counter-terrorism purposes.
In the enterprise space in particular, the combination of a better understanding of the use cases for Big Data and more mature product and service offerings resulted in a significant percentage of Big Data early adopters graduating from small, proof-of-concept projects to large-scale, production-level deployments.
The Big Data market is still within the confines of the early adopter phase and is poised for significant growth, but several obstacles stand in the way.
One is the well-publicized lack of analytic specialists and data scientists armed with both the technical skill and the business acumen to derive insights from large, multi-structured data sets merged from disparate sources.
Another is a lack of understanding among enterprises of how to organize Big Data staff so as to best identify business requirements for Big Data projects and effectively communicate the resulting insights to the business.
Organizational resistance to replacing “gut instinct”-style decision-making with Big Data analytics-driven decision-making is a further barrier, as is vendor marketing overly focused on “speeds-and-feeds,” product features, and “Big Data-washing” rather than laying out a vision for Big Data in the enterprise, articulating a path to achieve that vision, and maximizing Big Data’s potential to disrupt well-established vertical markets.
Some vendors are also developing Big Data platforms and tools that eschew open frameworks in favor of closed, locked-down solutions.
A lack of best practices and related technologies for managing Big Data as a corporate asset, including data quality, data governance, and security platforms and tools, is another obstacle, as is a dearth of Big Data application development tools and services that allow existing developers to build and customize Big Data applications using common and popular application development languages and processes.

Regarding methodology, the Big Data market size, forecast, and related market-share data were determined through extensive research of public revenue figures and media reports; interviews with vendors, venture capitalists, and resellers regarding customer pipelines and product roadmaps; and feedback from the Wikibon community of IT practitioners. Many vendors were unable or unwilling to provide exact figures for their Big Data revenue, and because many of the vendors are privately held, Wikibon had to triangulate many types of information to determine its final figures. The information used to estimate the revenue of private Big Data vendors included supply-side data collection, number of employees, number of customers, size of the average customer engagement, amount of venture capital raised, and age of the vendor.

It is critically important to understand how Wikibon defines Big Data as it relates to the market size overall and to revenue estimates for specific vendors in particular. First, from a technology perspective, Wikibon defines Big Data as those data sets whose size, type, and speed of creation make them impractical to process and analyze with traditional database technologies and related tools in a cost- or time-effective way.
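To illustrate what this kind of triangulation looks like in the abstract, here is a minimal Python sketch; the signals, multipliers, and averaging scheme are hypothetical illustrations, not Wikibon's actual model:

```python
# Hypothetical sketch of triangulating a private vendor's revenue from
# indirect signals. The multipliers below are illustrative assumptions,
# not Wikibon's actual model.

signals = {
    "employees": (250, 0.15),   # headcount x est. $M revenue per employee
    "customers": (120, 0.40),   # customer count x est. $M per engagement
    "vc_raised": (90.0, 0.30),  # $M raised x est. revenue-to-funding ratio
}

def triangulate(signals):
    """Average several independent signal-based revenue estimates (in $M)."""
    estimates = [value * multiplier for value, multiplier in signals.values()]
    return sum(estimates) / len(estimates)

print(f"Estimated Big Data revenue: ${triangulate(signals):.1f}M")
```

Averaging several independent estimates like this dampens the error of any single noisy signal, which is why multiple information types are listed above.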
Second, Wikibon believes Big Data requires practitioners to embrace an exploratory and experimental mindset regarding data and analytics, one that replaces gut instinct with data-driven decision-making and exchanges stubbornness for a willingness to question long-held assumptions.

Market leader IBM offers by far the largest product and services portfolio in both breadth and depth. Amazon continued, and Google kicked off, increasingly aggressive moves into the Big Data market. While M&A activity was relatively tepid, two important acquisitions took place in 2012 that have the potential to impact the long-term Big Data market. Microsoft officially entered the Hadoop market in 2012 with the release of an on-premise Hadoop product, HDInsight Server for Windows, and a cloud-based Hadoop service, Windows Azure HDInsight Service. A movement to bring SQL and NoSQL together in a unified platform was firmly established in 2012. Facebook, Google, and Amazon, as well as several three-letter government agencies, continued to invest heavily in commodity hardware to build out massive internal Big Data infrastructures.
As mentioned in the introduction of this report, Hadoop-related software and services matured rapidly in 2012, leading to increased adoption of enterprise-level products by companies in industries beyond the Web. As a result, leading Hadoop distribution vendors Cloudera and MapR enjoyed significant revenue growth last year.
Likewise, in the related NoSQL space, a handful of vendors that offer commercial versions of popular open source databases enjoyed significant revenue growth as pilot projects blossomed into production deployments supporting real-time, Web-scale applications and services. Below is a breakdown of Big Data revenue associated with vendors specializing in Hadoop and NoSQL software and services.

Wikibon further expects the balance of revenue generation and value to shift from Big Data infrastructure and middleware to value-add services and software over the next five years. Wikibon believes Big Data infrastructure, middleware, and technical services will become increasingly commoditized as they mature and common standards are adopted. Wikibon will be looking in more detail at the components that make up the Big Data market, shown in Figure 4.

Action Item: While Big Data vendor revenue is forecast to grow significantly over the next five years, Wikibon believes that Big Data practitioners will create much more value than technology and service providers over the long term.

The chart in the section "Big Data Revenue by Market Segment" should label the units as millions instead of billions. Based on what I know about the data science services industry, your estimates of the percentage of Big Data revenue for Opera Solutions, Fractal Analytics, and Mu Sigma are off significantly.
I came across this study thanks to a five-page article in Le Monde that cited Wikibon as one of its sources for market analysis. I noticed in the revenue table that the aggregated vendor total revenue (1,223,425) differs from what is shown on the page (1,244,602).
Jeff, I came across this again this week when it was posted on LinkedIn, and it got me thinking.
Thanks to QTS’s intuitive user interface, operating the TS-EC1280U R2 is smooth and straightforward. Quickly and easily find documents, photos, videos, and music with Qsirch, QNAP's built-in NAS search application that lets you find what you need through real-time, natural search. The TS-EC1280U R2 supports two on-board mSATA internal cache ports, and up to four SSDs can be used for caching.
The TS-EC1280U R2 supports online capacity expansion by cascading multiple QNAP RAID expansion enclosures to meet the needs of growing business data. QNAP TS-431 Personal Cloud Diskless 4-Bay NAS: The TS-431 is a powerful yet easy-to-use network storage center for backup, synchronization, remote access, and home entertainment.
The TS-431 supports multiple port-trunking modes, enabling IT administrators to set up link aggregation to increase total data transmission capability.
Thanks to the intuitive QTS user interface, operating the TS-431 is smooth and straightforward.
By connecting external drives to the Turbo NAS via eSATA or USB, you can easily configure and copy shared folders on the TS-431 to external devices. The File Station brings conventional desktop-style file operations to web browsers, allowing you to upload, download, and manage files on your TS-431 wherever there is an internet connection.
In contrast to the limited storage space and potential security concerns of public cloud services, the TS-431 is well suited for establishing a secure, large-capacity private cloud. With flexible privilege settings for editing and sharing notes, you can build a secure environment to work together with friends and colleagues. In an email sent to subscribers and in a notice posted on its website, the company cited a fast-moving cloud industry as the reason for the closure. Cloud storage is cheaper, expands endlessly, and needs little attention; but how much data can a company realistically park in the cloud?
Apparently, he was sold on an analyst's view that the cloud was the future of computing and that anyone still storing data in their own data center in the years to come would be considered a Luddite. It's a popular belief that moving data to the cloud not only takes advantage of the low prices cloud storage vendors can achieve through economies of scale, but also frees a company from the drudgery of buying, commissioning, provisioning, and maintaining storage systems.
Any move to the cloud must first consider the impact such a move would have on your users and their applications. Still, if the data exists in the cloud, local caches and copies can be retained and used as needed. But you will need to conduct a careful assessment of each of your data types, and the applications that access them, to see what options are available for moving data to the cloud and how each of those options would affect the user experience, data protection processes, bandwidth required at each location that needs to access the data and, of course, cost. The easiest data to move to the cloud is data belonging to apps that could be replaced with a Software-as-a-Service (SaaS) application.
Moving to a hosted Exchange product from Microsoft or any of its partners (such as Intermedia or Rackspace) would ensure the user experience remains the same as it was with the in-house Exchange implementation.


Similarly, it's often advantageous to take a current customer relationship management (CRM) system and put the whole application in the cloud.

In many organizations, the bulk of installed storage capacity is consumed by file systems of one type or another. There are a number of available solutions, including getting file access via hosted SharePoint, but that approach may require users to change their work habits. Consumer-grade file sync-and-share products such as Dropbox or SugarSync provide an obvious solution for users who need to access their files from multiple devices and locations. Corporate sync-and-share solutions such as EMC Syncplicity, Egnyte and Soonr, as well as offerings from Box and Dropbox, address enterprise management issues by integrating with the corporate Active Directory, imposing access control lists, and maintaining an audit trail of who has accessed which file and when.

The best way to get files into the cloud, while still making them accessible to users in other locations, is to install an integrated cloud storage appliance in each location. These gateways map the files into the block storage paradigm used by the cloud providers and, in many cases, also deduplicate the data so that, for example, a sales presentation sitting in dozens of users' home directories is only stored once. Most of these solutions also use the cloud to store an essentially unlimited number of snapshots. The best part about these offerings is that all corporate data can be accessed through any gateway in any location.

Assuming you don't have the sort of huge Oracle databases that require big RISC servers and all-flash arrays, there's a cloud-based solution for database applications. The other advantage of pinning is that applications can still run even if the Internet connection between the remote site and the cloud storage provider is down. For data protection, the gateways take a periodic snapshot of the database volume and send the snapshot to the cloud provider.
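As an illustration of the deduplication these gateways perform, the sketch below shows content-addressed chunk storage in Python; it is a generic rendering of the technique, not any particular vendor's implementation:

```python
import hashlib

# Minimal sketch of content-addressed deduplication (illustrative only,
# not any particular gateway vendor's implementation).
CHUNK_SIZE = 4 * 1024 * 1024  # 4 MiB chunks

chunk_store = {}   # hash -> chunk bytes (stands in for cloud object storage)
file_index = {}    # filename -> ordered list of chunk hashes

def store_file(name, data):
    """Split a file into chunks; identical chunks are stored only once."""
    hashes = []
    for i in range(0, len(data), CHUNK_SIZE):
        chunk = data[i:i + CHUNK_SIZE]
        digest = hashlib.sha256(chunk).hexdigest()
        chunk_store.setdefault(digest, chunk)  # skip if already present
        hashes.append(digest)
    file_index[name] = hashes

# The same presentation in two users' home directories is stored once:
deck = b"Q3 sales deck" * 100_000
store_file("/home/alice/deck.pptx", deck)
store_file("/home/bob/deck.pptx", deck)
print(len(chunk_store))  # 1 unique chunk despite two files
```

Because each chunk is named by the hash of its contents, duplicate data across users and sites collapses to a single stored copy, which is what keeps cloud storage and bandwidth costs down.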
While moving all of a company's data into cloud storage services may be somewhat impractical at this time, there are tools available today to move at least a copy of all your data to the cloud. Infrascale backup delivers a guaranteed 15-minute recovery of fully running systems in a customer's preferred cloud.
Your enterprise data protection family tree must incorporate both data availability and data management branches.

Factory revenue generated by the sale of Big Data-related hardware, software, and services took a major step forward in 2012, growing by 59% over 2011. This evolution naturally required increased investment in Big Data hardware, software, and services. For the Big Data market to reach its full potential, enterprises and vendors must overcome several obstacles. Closed solutions will limit interoperability with competing and complementary products and reduce customer choice. This list includes both Big Data pure-plays – vendors that derive close to, if not all of, their revenue from the sale of Big Data products and services – and vendors for whom Big Data sales are just one of multiple revenue streams.
We also held extensive discussions with former employees of Big Data companies to further calibrate our models. Projects whose processes are informed by this mindset meet Wikibon’s definition of Big Data, even in cases where some of the tools and technology involved may not. The company also supports its Big Data practice with a well-crafted, high-level marketing campaign focused around its Smarter Planet initiative that often includes illustrations of real-world Big Data deployments.
It did so mostly thanks to revenue derived from Big Data-related services, followed by sales of hardware to support Big Data deployments. Each introduced new products and services that allow enterprises to leverage Big Data analytics and storage-as-a-service with the usual benefits of public cloud services (elasticity, pay-by-the-drink pricing, trading upfront CAPEX for monthly OPEX, etc.). Specifically, Amazon introduced Redshift, an analytic-database-as-a-service, to its portfolio and struck a deal with MapR to allow customers to run its Hadoop distribution on Amazon Web Services, among other announcements.

WANdisco specializes in data replication across the WAN, which it applies to Hadoop (both its own distribution and Cloudera’s and Hortonworks’ distributions) with the aim of making the open source Big Data framework reliable enough to support mission-critical applications. Hadapt and Teradata Aster, which kicked off this movement in 2011, continued to lead the charge but were joined by competitors Cloudera, Microsoft, and others in 2012. Facebook alone spent close to $800 million on infrastructure in just three quarters of 2012. In many cases, companies that had previously deployed community (read: free) versions of vendor Big Data software bundles for proof-of-concept projects began upgrading to paid software and services to support production-level deployments.
The company’s NoSQL document store is in use at Bank of America, the Defense Intelligence Agency, and Warner Brothers, among other household names in the media and financial services industries. Note that these vendors account for total Big Data revenue of $272 million and are growing at a faster percentage rate than the rest of the Big Data market. Looking beyond 2013, Wikibon forecasts the total Big Data market to approach $50 billion by 2017, which translates to a 31% compound annual growth rate over the five-year period 2012-2017. As noted, hardware revenue accounts for 37% of Big Data revenue, and a large portion of software and services revenue is associated with infrastructure software and technical services that tie Big Data platforms and data together.
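As a quick sanity check on those figures, the compound annual growth rate follows directly from the endpoints; in the sketch below the $47 billion endpoint is an assumption chosen to represent "approaching $50 billion":

```python
# CAGR implied by growing from $11.59B (2012) toward ~$50B (2017).
start, years = 11.59, 5

def cagr(start, end, years):
    """Compound annual growth rate between two endpoint values."""
    return (end / start) ** (1 / years) - 1

print(f"{cagr(start, 47.0, years):.1%}")  # ~32% for a $47B endpoint
print(f"${start * 1.31 ** years:.1f}B")   # a 31% CAGR compounds to ~$44.7B
```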
Practitioners will increasingly look to NoSQL and in-memory database software, streaming analytic platforms, vertically focused analytical and transactional applications, application development platforms (both on-premise and cloud-based), and associated consulting and professional services to address specific, high-value business problems and opportunities. When selecting vendors to support Big Data initiatives, therefore, CIOs and Big Data practitioners must evaluate the products and services on offer in the context of how best to monetize Big Data to achieve competitive advantage. Upon further review and extensive feedback from the Wikibon community, it was decided that the original figure underestimated the level of revenue generated by original device manufacturers.

We are close to $8 million, with all revenue coming from open source and an increasing part from Big Data. The report from Wikibon is based on primary research with specific data by vendor that adds up to the market total in a "transparent" way. I do believe the India pure-plays have much to gain and will be investing in differentiating themselves with Big Data analytics and cloud. It is yet another reason, and even more proof, that Wikibon is such a virtuous organization.

Its intelligent desktop allows you to find the desired functions quickly, create desktop or group shortcuts, monitor important system information in real time, and open multiple application windows to run multiple tasks concurrently, bringing greater working efficiency. Featuring an Intel® Xeon® E3 v3 quad-core processor and 4 GB DDR3 ECC RAM, the TS-EC1280U R2 greatly enhances the efficiency of CPU-intensive tasks and serves more concurrent tasks at the same time. By installing mSATA flash modules (optional purchase), the IOPS performance of storage volumes can be boosted remarkably. The motherboard itself can be removed and replaced without needing to remove the heavy TS-EC1280U R2 from the rack cabinet, making it especially useful for maintaining the TS-EC1280U R2 in data centers with multiple rack cabinets. With Notes Station, you can easily create digital notes on your private cloud and share them with friends and colleagues at no extra cost. The front-panel USB port features a one-touch-copy function for instantly backing up data on external drives to the TS-431, or alternatively backing up data from the TS-431 to external drives.
Data, logs, and ISO images of CDs and DVDs can be centrally stored on the TS-431 and protected by an integrated antivirus solution. Supporting file extraction, folder creation, and smart search for files and folders, File Station also allows you to easily share files with friends and family via unique URLs. The myQNAPcloud service allows remote access to files stored on the TS-431, letting you play and share multimedia from the TS-431 or conveniently check the system status from anywhere. In just a few steps you can create notes from web pages (or by importing from the TS-431), edit notes with the intuitive web-based editor, and take screenshots for quick note-taking with Qsnap or the Notes Station Clipper extension for Chrome. In addition, Notes Station enables you to create to-do lists and calendar alerts that can be synced to your Google Calendar, helping you manage your daily tasks more efficiently.
The Photo Station makes photo sharing simple, and you can designate your own access controls.
All consumer plans will be discontinued in order to focus attention on their enterprise services. Refunds will be given to any customers who have paid for service beyond the May 31 deadline. They were the only cloud storage service to use convergent encryption after Bitcasa Drive was discontinued. Depending on your specific needs, the size of your environment, and your budget, it's essential to weigh all cloud and on-premises options. The CIO just got back from his favorite analyst firm's annual cloud conference and golf tournament and promptly decreed the company was going to move all its data, not just the archives, to public cloud storage. In addition, if cloud storage is endlessly elastic, it can be used without the kind of careful capacity planning in-house storage requires.
In many cases, it may be feasible to move much of your data to the cloud, but many key applications are likely to require their data to be kept in-house. You may see a small increase in network traffic as users retrieve their data from the cloud server, but the cost of running Exchange would be limited to the roughly $10 per-user charge the provider levies for the service. Switching to a Web-based CRM app will make life easier for sales teams and other road warriors compared with a traditional in-house CRM system.


These files are stored on dedicated network-attached storage (NAS) appliances and on a few Windows file servers at remote sites.
But some of these commercial services lack the kind of data access controls, security and scalability that enterprises require.
While this is a big improvement, and these offerings can be very useful for sharing data with highly mobile users who need their data on the road and with people outside the organization, most organizations are unlikely to give up traditional file services in favor of file sync-and-share. Between these snapshots and the cloud storage provider replicating the data to multiple locations, traditional backups could become a thing of the past.
While gateway products vary as to how well they handle multiple users modifying the same file at the same time, they all provide a better shared file-access model for multiple locations than traditional NAS appliances. You could run Microsoft SQL Server or Oracle in an Amazon Elastic Compute Cloud (EC2) instance accessing Amazon's Elastic Block Store (EBS), but adding 20ms to 200ms of latency between the application running on a user's PC and the database server will likely have a negative impact on performance and user experience. For databases running on virtual Windows or Linux machines, file storage can be provided by cloud storage gateways.
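A rough back-of-the-envelope calculation shows why that added latency matters; the number of round trips per user action below is an illustrative assumption for a chatty client-server application:

```python
# Back-of-the-envelope: WAN latency vs. LAN latency for a chatty app.
# The number of round trips per user action is an illustrative assumption.
round_trips = 50          # queries a client fires to render one screen
lan_rtt_s = 0.0005        # ~0.5 ms round trip on a LAN
wan_rtt_s = 0.060         # ~60 ms round trip to a cloud region

for label, rtt in (("LAN", lan_rtt_s), ("WAN", wan_rtt_s)):
    print(f"{label}: {round_trips * rtt * 1000:.0f} ms spent waiting per screen")
# LAN: 25 ms vs. WAN: 3000 ms -- a three-second stall users will notice
```

The per-request penalty looks small, but it multiplies with every round trip, which is why keeping latency-sensitive databases close to their applications matters.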
Pinning the entire volume holding the database to the gateway's local storage, rather than using it as a cache, offers two significant advantages.
While most locations have, or could be equipped with reliable, high-bandwidth connections to the Internet, good connectivity for field offices and remote locations may be too expensive or simply unavailable. In the event of a disaster, a virtual version of the gateway can be spun up in EC2 or another cloud compute provider and the apps can be mounted from the snapshot. Some applications, like email, can be shifted to the cloud fairly easily, while other applications can be replaced with cloud-based apps that offer equivalent functionality.
Feedback from the Wikibon community included multiple reports of $100 million-plus deals from both government and commercial buyers. While a detailed discussion of these obstacles is outside the purview of this report, they are worth noting. The biggest criticism of IBM from practitioners is that the company’s portfolio is so wide and deep that it causes confusion. HP, by its sheer size, is in a position to impact and participate in a number of Big Data deployments. Consolidated across vendors, professional and cloud services revenue accounted for $5 billion of total 2012 Big Data revenue.
VMware had already begun efforts to apply virtualization technology to Hadoop, and the acquisition of CETAS gives the vendor a more comprehensive Big Data portfolio.
Microsoft also announced PolyBase, which aims to allow the SQL Server Parallel Data Warehouse to execute SQL queries against data stored in Hadoop. This spending is reflected in Big Data revenue for the original device manufacturer (ODM) category that appears at the bottom of the table. While the global economic outlook is for slow to stagnant growth over this period, Wikibon believes the Big Data market will not be severely impacted and may, in fact, benefit from enterprises needing “to do more with less,” which effective Big Data analytics facilitates.
This includes evaluating “speeds and feeds” and other product features but should also include evaluating how well vendors can assist enterprises in adopting a sustainable culture of data-driven decision-making.
I don't see any reference to market-share data in the Transparent Research report description, so it's not clear where their baseline comes from. They have doubled their revenue and improved market share in the last six years and are giving stiff competition to global MNCs.
This is exactly the kind of info I need for context and I know many in large enterprises need this too. In which category of your market forecast would you place these data collection services - perhaps SaaS, app software, professional services, or a mix? Qsirch even remembers your search history so you can quickly navigate to files you've previously searched for.
It is perfect for improving the overall workflow of applications with demanding random IOPS, such as databases and virtualization. Coupled with support for large storage capacity, the TS-EC1280U R2 is perfect for data centers that store big data and need fast data transmission, and for editing large videos on the fly without transferring them between the TS-EC1280U R2 and desktop computers. It is especially useful for large-data applications such as video surveillance, data archiving, and TV broadcast storage, to name a few. Powered by an advanced ARM® Cortex® A9 dual-core processor, the TS-431 delivers incredible performance as a home NAS with three USB 3.0 ports, one eSATA port, and dual LAN ports.
You can also use the TS-431 as a home multimedia hub to enjoy your photos, music, and videos while downloading or backing up files without impacting the TS-431’s performance. QNAP’s NetBak Replicator supports real-time and scheduled data backup for Windows (including Outlook email archiving), and Mac OS X users can use Time Machine to effortlessly back up data to the TS-431.
With reliable security features, SSL, and password options, managing and sharing your files is simple and safe with File Station. You can also easily leverage the rich media content stored on the TS-431 to enhance your notes, and access them on your mobile phone or tablet using the Qnotes mobile app anytime, anywhere. Simply upload your photos to the TS-431 to create and organize photo albums through a web browser, and share them with family and friends through social networking sites such as Facebook, Google+, Twitter, and Pinterest. Services that don’t offer a unique selling point are going to find it hard to compete.
And if that's not convincing enough, consider that cloud storage providers say they protect our data by replicating it to multiple data centers. For example, an internal Exchange infrastructure may have its own reliability issues and users might be spending too much time pruning their email to meet mailbox quotas when they could be doing more constructive things. While simply moving all the files to Amazon Simple Storage Service (S3) or another cloud storage service would effectively put that data in the cloud, S3 and similar services present their data through an object storage interface that would essentially make it inaccessible to users.
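To make the interface mismatch concrete, here is a minimal boto3 sketch (bucket and key names are hypothetical): S3 serves whole objects by key over HTTP, with none of the folder hierarchy, locking, or in-place editing users expect from a file share:

```python
import boto3

# Minimal sketch (hypothetical bucket and key names): S3 exposes whole
# objects by key over HTTP. There is no real folder hierarchy, no file
# locking, and no in-place editing as with an SMB/NFS file share.
s3 = boto3.client("s3")

obj = s3.get_object(Bucket="corp-files", Key="home/alice/report.docx")
data = obj["Body"].read()  # the entire object is downloaded at once

# Any edit means re-uploading the whole object under the same key.
s3.put_object(Bucket="corp-files", Key="home/alice/report.docx", Body=data)
```

This whole-object, key-based model is what a cloud storage gateway hides from users by translating familiar file operations into object (or block) operations behind the scenes.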
The best of these offerings maintain basically the same user experience, including performance, as existing NAS systems, and provide users with easier and broader access to shared files from any location. Also, with many users syncing all the files in the folders they're subscribed to, bandwidth requirements could become an issue.
The biggest difference is that instead of using SMB, iSCSI will be used to access the gateway resources.

IBM combats this confusion by initiating many Big Data customer engagements through its professional services division. Google finally got into the Big Data game by productizing Big Data tools and technologies it has long used internally, such as BigQuery, and likewise introduced MapR as a service via Google Compute Engine. The creation of the Pivotal Initiative further indicates that VMware and EMC are continuing to invest in Big Data for the long term. Specifically, Facebook and others like it purchase, configure, and deploy off-the-shelf hardware from ODMs such as Quanta, rather than purchasing commodity machines from vendors such as Dell or HP, to support the majority of their operations. Probably because we are mainly a European vendor, but we are starting to have some customers in the US too.

Qsirch is a huge productivity boost – it greatly reduces the amount of time spent looking for files on the NAS so you can focus on other tasks. The internal cache port design reserves the space of the hard drive trays for more storage capacity. A wide range of third-party backup software, such as Acronis® True Image and Symantec® Backup Exec, is also supported. With the Qfile mobile app (for iOS® and Android™) you can instantly upload new photos from your mobile device to the TS-431.

Moving to a hosted Exchange offering from Rackspace or Intermedia, for example, would be relatively painless; users would get 10 GB or 100 GB mailboxes with additional archive space, and you may be able to avoid a costly and disruptive upgrade from Exchange 2010 to Exchange 2013. A cache miss on a cloud gateway will add 20ms to 200ms of latency, and just a few cache misses will cause a big drop in performance. For other applications, especially those that might suffer from the latency that cloud storage incurs, a hybrid approach, where data is stored locally for performance but eventually ends up in the cloud, may be best. A challenge and area of focus for IBM moving forward is to continue to articulate its Big Data vision in a way that focuses on industry solutions and not point products.
Services that make data sets available to enterprises and other service providers are a key enabler to the Data Economy.





