Northbrook, IL - Customer Success - Northbrook, IL - Internship
This Analytics Intern will support one of the nation's largest drug chains with customized big data solutions, helping to drive adoption of RSi's analytical reporting and business intelligence applications.
There is an urgent need for an extreme transformation of the customer relationship. Customers live in a world of self-service, big data, customer automation and the integration of the online and offline worlds. If your organization fails to implement this digital relationship, your future becomes very uncertain. Succeeding in the digital transformation will not be enough, however. As a consequence of the digital evolution, there is also a need for a human transformation of your customer relationship. Thinking about the role of humans versus machines, the role of the warm human touch, and the power to connect people with people: these are the key challenges in this domain. "When Digital Becomes Human" is a story about combining the digital and the human transformation in your customer strategy. It will take you on a journey to the future. It is provocative, exciting and scary. Enjoy this amazing view on the future of marketing!
This paper is an adaptation of a longer report commissioned by the UK Data Service. The longer report contributes to ongoing support for the Big Data Network, a programme funded by the Economic and Social Research Council (ESRC), and can be found at doi:10.7207/twr16-02.
This paper discusses requirements for preserving transactional data and the accompanying challenges facing the companies and institutions that aim to re-use these data for analysis or research. It presents a range of use cases (examples of transactional data) in order to describe the characteristics and difficulties of these "big" data for long-term access. Based on the overarching trends discerned in these use cases, the paper defines the challenges facing the preservation of these data early in the curation lifecycle. It points to potential solutions within current legal and ethical frameworks, but focuses on positioning the problem of re-using these data from a preservation perspective.
In some contexts, these data could be fiscal in nature, deriving from business "transactions". This paper, however, considers transactional data more broadly, addressing any data generated through interactions with a database system. Administrative data, for instance, is one important form of transactional data collected primarily for operational purposes, not for research. Examples of administrative data include information collected by government departments and other organisations when delivering a service (e.g. tax, health, or education), and these can entail significant legal and ethical challenges for re-use. Transactional data, whether created by interactions between government database systems and citizens or by automatic sensors or machines, hold potential for future developments in academic research and consumer analytics. Re-use of reliable transactional data in research has the power to improve services and investments by organisations in many different sectors. Ultimately, however, these data will only lead to new discoveries and insights if they are effectively curated and preserved to ensure appropriate reproducibility. This paper explores challenges to this undertaking and approaches to ensuring long-term access.
President Obama delivered a speech at the Department of Justice to announce the outcomes of a broad-ranging and unprecedented review of U.S. intelligence programs. [Also read 1) The Fight Against Big, Bad Data 2) Big Data and the Future of Privacy] The review examined how, in light of new and changing technologies, we can use […]
"Information has dramatically increased its role in society in recent years. Big data, mass data, very large data, raw data, open data, data analytics, digitisation...," Trafi's Information Director and Director General for Data Resources, Mia Nykopp, lists the various types of information that affect our lives. Trafi is actively involved in the 10th ITS European Congress in Helsinki, held on 16-19 June 2014.
We seek a biologist who has expertise in analysis of big data, modeling, bioinformatics, genomics/transcriptomics, biostatistics, or other quantitative and/or... From University of Richmond - Thu, 06 Jul 2017 23:17:18 GMT - View all Richmond, VA jobs
A consequence of the Moore-Nielsen prediction is the phenomenon known as Data Gravity: big data is hard to move around, so it is much easier for smaller applications to come to it. Consider this: it took mankind over 2,000 years, up to 2012, to produce 2 exabytes (2×10^18 bytes) of data; now we produce this much in […]
My 2014 predictions are finally complete. If Open Source equals collaboration or credibility, 2013 has been nothing short of spectacular. As an eternal optimist, I believe 2014 will be even better: Big data's biggest play will be in meatspace, not cyberspace. There is just so much data we produce and give away, great opportunity for […]
I went to the morning session of Hello Culture, a one-day conference discussing 'big data' in the context of arts and culture. I was on a panel called 'Data: Is the Tail Wagging the Dog?' I was given a few minutes to talk to the theme and so I put some slides together and […]
Imagine living in the shadow of an active volcano. That's the reality for thousands of residents near Mexico City, as Popocatepetl looms over their city. Scientists from the USGS, UNAM and CENAPRED are using Big Data to monitor and analyze input from hundreds of live sensors in an effort to keep people safe and ready to evacuate in the event of an eruption.
Located in North Carolina's Pisgah National Forest, PARI is a non-profit center for astronomical research and education using the power of Big Data to show us the universe as we've never seen it before. PARI is on a mission to digitize these two hundred thousand donated star plates into a massive online database - so they can be accessed and analyzed by researchers around the world.
How large is the universe? How did the universe begin? Could there be intelligent life out there? The answers to these questions are right in front of us...and it's being stored on EMC donated storage.
Astronomers have been recording the night sky since the mid 1800's on photographic glass plates known as star plates. These fragile plates have been hidden away in basement archives for generations...that is until now.
How can Trust in action help impact the lives of millions? EMC's John Custer, former U.S. Major General revisits the Horn of Africa to explore how the power of Big Data is transforming the future for generations to come.
"Big Data: Embracing Data to Transform Healthcare and Pharma Commercial Strategy - Featuring Expert Panel Views from Industry Survey 2016" provides a comprehensive analysis of the Big Data landscape. GBI Research conducted an extensive industry survey of 73 experts from the pharmaceutical and healthcare industries.
Pune, Maharashtra -- (SBWIRE) -- 02/09/2017 -- "Big Data: Embracing Data to Transform Healthcare and Pharma Commercial Strategy - Featuring Expert Panel Views from Industry Survey 2016" provides a comprehensive analysis of the Big Data landscape. The report conducted an extensive industry survey of 73 experts from the pharmaceutical and healthcare industries, including both organizations that already utilize Big Data and those that do not. The survey gathered experience and opinion on the use of Big Data, and insights on key trends for the present and future use of the technology within healthcare.
Big Data refers to any data set that is too large to store, process or analyze using traditional database software and hardware. It can have a significant impact on all aspects of the pharmaceutical and healthcare sector, and companies are making large investments to leverage the technology more effectively.
Browse more detailed information about the Big Data report
The report features an overview of Big Data and its place within healthcare. It examines the factors driving and necessitating the use of the technology within this industry, and provides detailed examples of how different Big Data sources and analytics techniques could be used to provide direct benefits to pharmaceutical companies, healthcare institutions and patients.
Report Scope:
- What is Big Data? What is its place within healthcare, and what are the main data sources?
- How prevalent is the use of Big Data in healthcare?
- What are the main driving factors necessitating the use of Big Data in healthcare? What is the relative importance of these factors according to industry?
- What are examples of the commercial benefits that the use of Big Data and analytics can provide in different aspects of the industry?
- What are the main challenges associated with Big Data in healthcare? What is the relative importance of these factors according to industry? For the organizations that do not yet utilize Big Data, what specific reasons have led to their decision not to do so?
- How do major pharmaceutical and healthcare companies use Big Data in the real world? What are some of the main partnerships between Big Pharma and technology companies? What is the underlying technical architecture of Big Data in healthcare?
- What is the likelihood that organizations that already use Big Data will increase their investment within the next five years? Will those that do not currently invest in the technology begin doing so in the next five years?
- How can Big Data be effectively implemented within an organization?
The healthcare report will give clients an understanding of market opportunities, competitive analysis and forecasts for the women's healthcare industry. Interested clients will get a view of how therapies are developing for changing conditions, and of all the key factors that combine to affect or improve women's health.
Detailed TOC of Big Data - Assessing the Need for a Targeted and Specialized Approach
1 Big Data Overview
- What is Big Data?
- The 'Three Vs' of Big Data: Volume, Velocity and Variety
- The Sources of Big Data in Healthcare
- Big Data Lifecycle
- How Prevalent is the Use of Big Data in Healthcare? Results from our Industry-Wide Survey
2 Drivers of Big Data in Healthcare
- Advances in Technology: Explosion in Data Generation
- Next-Generation Sequencing Technologies: Outpacing Moore's Law
- Proteomic Databases: ProteomicsDB Designed with Big Data Analytics in Mind
- Electronic Health Records: A Form of Big Data
- Social Media: Information That Cannot Be Found Anywhere Else
- Devices: Smartphones, Wearables and Telemedicine Devices Represent a Continuous Source of Big Data
- Cloud Technologies: Often Integral to Big Data
- Needs and Trends Driving the Use of Big Data in Healthcare
3 Commercial Implications of Big Data in Healthcare
- Predictive Modeling: Fundamental Source of Big Data's Power
- Using Big Data for Patient-Specific Modeling: Potential for Huge Healthcare Savings
- Big Data Unlocks the Potential of Personalized Medicine and Targeted Therapies
- Utilizing the Unique Big Data Provided by Wearables and Fitness Trackers
- Big Data for a More Systemic Approach to Drug Repositioning
- Drug Discovery and Pre-Clinical Trials: Big-Data-Guided Drug Development
4 Appendix
- GBI Industry Survey: Breakdown of Respondents by General Industry
- GBI Industry Survey: Breakdown of Respondents by Specific Sector
- GBI Industry Survey: Breakdown of Respondents by Region
- GBI Industry Survey: Proportion of Healthcare Organizations that Currently Utilize Big Data
- GBI Industry Survey: Big Data Utilization in Healthcare, Comparison of Expert Panels from Europe, North America and Asia
- GBI Industry Survey: Most Important Factors Promoting the Use of Big Data in Healthcare
- GBI Industry Survey: Most Important Factors Promoting Big Data, Pharmaceutical Expert Panel vs Overall Healthcare Expert Panel
- GBI Industry Survey: Most Important Factors Promoting Big Data, Regional Breakdown
About Absolute Report
Absolute Reports is an upscale platform to help key personnel in the business world in strategizing and taking visionary decisions based on facts and figures derived from in-depth market research. We are one of the top report resellers in the market dedicated towards bringing you an ingenious concoction of data parameters.
Global Big Data Infrastructure Market 2016-2020 has been prepared based on an in-depth market analysis with inputs from industry experts. The report covers the market landscape and its growth prospects over the coming years. The report also includes a discussion of the key vendors operating in this market.
Pune, Maharashtra -- (SBWIRE) -- 02/09/2017 -- The Global Big Data Infrastructure Market Research Report covers the present scenario and the growth prospects of the Global Big Data Infrastructure Industry for 2017-2021. The report has been prepared based on an in-depth market analysis with inputs from industry experts. It covers the market landscape and its growth prospects over the coming years, along with a discussion of the key vendors operating in this market.
Big data refers to a wide range of hardware, software, and services required for processing and analyzing enterprise data that is too large for traditional data processing tools to manage. In this report, we have included big data infrastructure, which includes mainly hardware and embedded software. These data are generated from various sources such as mobile devices, digital repositories, and enterprise applications, and their size ranges from terabytes to exabytes. Big data solutions have a wide range of applications such as analysis of conversations in social networking websites, fraud management in the financial services sector, and disease diagnosis in the healthcare sector.
Report analysts forecast the Global Big Data Infrastructure market to grow at a CAGR of 33.15% during the period 2017-2021.
The Global Big Data Infrastructure Market Report is a meticulous investigation of the current scenario of the global market, covering several market dynamics. The research report is a resource providing current as well as upcoming technical and financial details of the industry to 2021.
To calculate the market size, the report considers the revenue generated from the sales of Global Big Data Infrastructure globally.
Key Vendors of Global Big Data Infrastructure Market:
Other prominent vendors
Global Big Data Infrastructure market report provides key statistics on the market status of the Global Big Data Infrastructure manufacturers and is a valuable source of guidance and direction for companies and individuals interested in the Global Big Data Infrastructure industry.
Global Big Data Infrastructure Driver:
- Benefits associated with big data
- For a full, detailed list, view our report
Global Big Data Infrastructure Challenge:
- Complexity in transformation of procured data to useful data
- For a full, detailed list, view our report
Global Big Data Infrastructure Trend:
- Increasing presence of open source big data technology platforms
- For a full, detailed list, view our report
Geographical Segmentation of Global Big Data Infrastructure Market:
- Global Big Data Infrastructure in Americas
- Global Big Data Infrastructure in APAC
- Global Big Data Infrastructure in EMEA
The Global Big Data Infrastructure report also presents the vendor landscape and a corresponding detailed analysis of the major vendors operating in the market. Global Big Data Infrastructure report analyses the market potential for each geographical region based on the growth rate, macroeconomic parameters, consumer buying patterns, and market demand and supply scenarios.
Key questions answered in Global Big Data Infrastructure market report:
- What are the key trends in Global Big Data Infrastructure market?
- What are the growth restraints of this market?
- What will the market size and growth be in 2020?
- Who are the key manufacturers in this market space?
- What are the Global Big Data Infrastructure market opportunities, market risks and market overview?
- What revenue has this market generated in previous years, and what will it generate in the coming years?
The report then estimates 2017-2021 market development trends of Global Big Data Infrastructure market. Analysis of upstream raw materials, downstream demand, and current market dynamics is also carried out. In the end, the report makes some important proposals for a new project of Global Big Data Infrastructure market before evaluating its feasibility.
In five years you’ll be using Insight PaaS for big data in the public cloud. On-premise won’t be an option. Here is why. Cloud Is The Hottest Market For Big Data Technology The shift to the cloud for big data is on. In fact, global spending on big data solutions via cloud subscriptions will grow […]
Tony Frazier, senior vice president of government solutions at DigitalGlobe, has said the U.S. government acknowledges the need to leverage big data analytics, machine learning, automation and other commercial technology platforms in order to help transform the intelligence community and global mapping efforts. Frazier wrote in a blog post published Friday that government leaders such as Robert Cardillo, director […]
DigitalGlobe and MacDonald, Dettwiler and Associates have entered an agreement to make data from MDA's RADARSAT-2 satellite available via DigitalGlobe's GBDX geospatial big data platform. The partnership intends to make new GBDX uses possible through the integration of optical and radar satellite data, DigitalGlobe said Monday. The RADARSAT-2 satellite works to collect synthetic aperture radar data to help users see Earth […]
With today's proliferation of data, digital transformation (DX) has become more than a hot topic: it's an imperative for businesses of all shapes and sizes. The collision of data, analytics and technology has businesses, analysts and consumers both excited and scared about what could happen next.
On one hand, everyone from banks to bagel shops and travel sites to tractor manufacturers has found new ways to connect the dots in their businesses while forging stronger, more dynamic customer engagement. Artificial intelligence (AI) has come of age in technologies such as smart sensors, robotic arms, and devices that can turn lights and heat on and off, adjust for changes in conditions and preferences, and even automatically reorder food and supplies for us.
However, today's Chief Analytics Officer (and Chief Data Officer and Chief Digital Officer, for example) faces both the promise and precariousness of digitizing business. While significant opportunities abound to drive revenues and customer connectivity, any leader will freely confess there are myriad technological, business and human obstacles to transforming even one element of business, introducing a new unique product or even meeting regulatory requirements.
The Big Data Dilemma
Big Data is at once the promise of the DX and its biggest roadblock. A recent Harvard Business Review article put it succinctly: "Businesses today are constantly generating enormous amounts of data, but that doesn't always translate to actionable information."
When 150 data scientists were asked if they had built a machine learning model, roughly one-third raised their hands. How many had deployed and/or used this model to generate value, and evaluated it? Not a single one.
This doesn't invalidate the role of Big Data in achieving DX. To the contrary: the key to leveraging Big Data is understanding its role in solving your business problems, and then building strategies to make that happen, recognizing, of course, that there will be missteps and possibly complete meltdowns along the way.
In fact, Big Data is just one component of DX that you need to think about. Your technology infrastructure and investments (including packaged applications, databases, and analytic and BI tools) need to similarly be rationalized and ultimately monetized, to deliver the true value they can bring to DX.
Odds are many components will either be retired or repurposed, and you'll likely come to the same conclusion as everyone else: your business users are going to be key players in how DX technology solutions get built and used. That means your technology and analytic tools need to give you the agility and flexibility to prototype and deploy quickly; evolve at the speed of business; and empower people across functions and lines of business to collaborate more than they've ever done before.
Beyond mapping out your overarching data, technology and analytic strategies, there are several areas to consider on your DX journey. Over the next three posts, I'll focus on how to:
Visualize your digital business, not your competitorsâ
Unleash the knowledge hidden within your most critical assets
Embrace the role and evolution of analytics within your journey
To whet your appetite, check out this short video on the role of AI in making DX-powered decisions.
Looking for an open-source option designed for horizontal scalability and fault tolerance to manage your microservices? Camunda has just launched Zeebe, a big data system orchestrator, to help you keep track of everything and anything.
Data is everywhere. In everything we do, in everything we see, data can be found in every place in the world that surrounds us. It is this concept that has led to the rise of new "Big Data" initiatives to try to harness all of this information. The issue with data is that if you gather too much, the signal you are trying to evaluate can get lost behind all of the noise. More importantly, perhaps, all of the data that is collected amounts to nothing unless actual steps are taken to communicate it to others, and to put it to use. At Nyaya, we have been working to get to that point of data communication.
Much of my work here this summer has been on a new Data Communication Initiative. Through all of our programs we collect boatloads of data; however, it is often difficult to use, and it is rarely communicated back to the staff who collect it. With the recent work we have been doing, this is all changing. Under a new plan we have been putting into place, every month new data will be hung up in the hospital's new Conference and Training Center, as seen in the photo below.
Bulletin Board in New Conference Room
We presented the data at a data meeting which will now become a monthly event at the hospital. Since the bulletin board has been set up, dozens of staff members have taken time out of their busy schedules to come and check out the information, in order to help inform their actions moving forward.
Nyaya's community health workers also collect a great deal of data from the communities they work in. As shown in the photo below, for the first time, Ashma, the Associate Director of Community Health, was able to show visualized data back to the community health workers, allowing them to finally see the fruit of their labor.
Data Being Communicated to CHWs
Beyond these two projects, other plans are in place to provide weekly data to community health workers, to provide the clinical staff with data to supplement their daily lectures, and to use the help of our GlobeMed chapter to write actionable reports on different data points.
Data may seem like an abstract concept, but it is real, usable information that can improve the care we provide to our patients and the strength of our public health program. All that needs to be done is to take the time to tap into its potential. We are on the road to doing just that.
ISS Netherlands wanted to improve its competitiveness in a crowded facilities services market. With the help of Microsoft technology including Windows Azure and Microsoft Office 365, the company has built a solution that automatically notifies workers when facilities need to be maintained, improving client service while reducing customer costs. We recently spoke with Martijn Jansen, Business Technology Manager at ISS Netherlands, to learn how this solution is transforming the facilities services business.
Q: Please tell us about ISS Netherlands.
Martijn Jansen: ISS Netherlands is part of ISS, which is a leading global provider of facility services. In the Netherlands, we offer services to hospitals, factories, government offices, and companies of all sizes. ISS Netherlands has 12,000 employees who work with a broad spectrum of clients ranging from small Dutch companies to large global enterprises.
Q: What challenges were you facing that led you to build a solution using Windows Azure and Office 365?
Jansen: Facilities services is a very competitive market, especially in the area of pricing. In the past few years, the focus has primarily been on pricing, since it’s been hard to differentiate oneself in other areas.
With Windows Azure and Office 365, we can easily scale up and down. This was important to us given the changing nature of our business. In addition, Windows Azure and Office 365 enabled us to innovate without a lot of risk to our company. ISS Netherlands has a relatively small IT department. We didn’t want to make a large investment that would require us to purchase and maintain our own hardware and software. Instead, we wanted to build a cloud-based solution that was both adaptable and flexible. Windows Azure and Office 365 met all of these needs.
Q: What solution did you build?
Jansen: In a traditional cleaning environment, for example an office building, the standard proposition is that you clean each rest room twice a day, five days a week, 52 weeks a year, whether it’s needed or not. So even if no one is there, you’d still clean the toilets twice a day. We wanted to improve our efficiency by only cleaning where it was needed, and doing so right when it was needed.
To do that, we created a simple “on-off” sensor that records every time a toilet, soap dispenser, or towel dispenser has been used, and then sends a message to our SQL Server database, which is run as a virtual machine within the Windows Azure cloud platform. We then created business rules so that after a specified number of visits, Windows Azure sends a message to all of the mobile phones used by the employees responsible for working in the area informing them that a particular toilet or towel dispenser needs to be serviced. Using Microsoft BizTalk Server, we send this information to our Financial Management Information System and to our customer SharePoint portals to keep track of our progress.
We are using the same system to measure whether a plant needs water, whether a door has inadvertently been left opened at night, or whether a mouse trap needs to be reset. The minute the threshold is reached, we know what has to be done in a specific room of a given building. So we're no longer doing a standard routine of cleaning—only when it’s required.
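The threshold logic Jansen describes (count sensor events per fixture, then trigger a service notification once a configured limit is reached) can be sketched as follows. This is a minimal illustration only, not ISS's actual implementation: the fixture IDs, threshold value, and class names are all hypothetical, and the real system persists counts in SQL Server on Windows Azure rather than in memory.

```python
# Hypothetical sketch of the "notify after N uses" business rule.
# In the real solution, counts live in a SQL Server database and the
# notification goes out to workers' mobile phones via the Azure platform.
from collections import defaultdict

class FixtureMonitor:
    """Tracks on-off sensor events and flags fixtures due for service."""

    def __init__(self, threshold):
        self.threshold = threshold          # uses allowed before service
        self.counts = defaultdict(int)      # fixture_id -> use count

    def record_use(self, fixture_id):
        """Register one sensor event; return True if service is now due."""
        self.counts[fixture_id] += 1
        return self.counts[fixture_id] >= self.threshold

    def mark_serviced(self, fixture_id):
        """Reset the counter once a worker has serviced the fixture."""
        self.counts[fixture_id] = 0

# Example: with a threshold of 3, the third use of "toilet-2F" triggers
# a notification, while the soap dispenser stays below its limit.
monitor = FixtureMonitor(threshold=3)
events = ["toilet-2F", "toilet-2F", "soap-1A", "toilet-2F"]
due = [f for f in events if monitor.record_use(f)]
```

The same counter-and-threshold pattern covers the other sensors mentioned (plant moisture, door-left-open, mousetraps): only the event source and the threshold differ per rule.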
Q: That seems like a lot of information to track. How is the ability to process big data playing into your solution?
Jansen: The ability to cost-effectively process big data has made the solution possible. Each sensor captures multiple messages on an hourly basis, and we have thousands of sensors set up across all the facilities we manage. Our database holds several terabytes of data, and that quantity is growing exponentially with every customer we add. None of this would have been possible five years ago. The amount of processing power that we would have needed and the amount of data that needs to be stored would have made the investment cost-prohibitive.
Q: What role do Office 365 and SharePoint Online play in the solution?
Jansen: We’ve developed a customized portal for each customer using SharePoint Online, which they can view using Office Web Apps. The portal shows when we are coming to service the facility. It incorporates a map with all the inspection points, including mouse traps, toilets, towel dispensers, etcetera. It highlights all current issues as well as those that already have been fixed. And it displays the sensor data as easy-to-read bar charts searchable by topic via drop-down menus. This allows clients to get a high-level overview of what’s happening, while also reviewing the detailed data. So, for example, a manager can get a summary of pest control issues in his building, and then click through the portal to see the status of a specific room or even a specific trap.
We’ve also built a new feature into the SharePoint portal that allows customers to communicate with us. Initially, we’ll be deploying it at one of the hospitals we manage so that our client can inform us when a patient leaves and the bed needs to be made. Most Dutch hospitals have a shortage of beds, so it’s paramount that they be turned around quickly. Once the bed is made, our workers can send a message via their mobile phone back to the SharePoint portal. This will enable the hospital to put its bed space to more efficient use.
Q: How has your automated solution improved customer service?
Jansen: It’s improved our customer service by enabling us to offer a higher-quality service at a lower price. By using the sensors, we’re focusing on the jobs that need to be done at that moment and eliminating unnecessary work. For example, we used to visit one of our pest control contacts every week. But with the sensors, we now know the number of mousetraps going off without actually visiting the site. So we can now simply visit the site when a mouse is caught rather than checking whether or not it’s happened. That’s reduced our hours and thus our cost to the customer. What’s more, customers now feel we are helping them 24/7 rather than just a few days a month.
We’re also providing our customers with better information. With all the sensory data at our disposal, we can tell our clients what’s actually happening rather than using our gut feeling. Customers really like that because they now have precise data that they can present to their managers as well.
Q: What benefits has ISS Netherlands derived from the solution?
Jansen: It’s made us more competitive. By using web portals and other technology we’re on top of the league. With our automated solution, we’ve been able to serve more customers at lower cost with the same number of workers. But we’re not stopping there. Each day we’re examining how we can expand the uses of our solution to further improve our services. Using Microsoft technology, we’ve created a whole different way of managing facilities—one that’s highly efficient and provides top value for our customers.
Last Monday evening we had the first Singapore Oracle Sessions - an informal meetup of Oracle professionals thrown together at the last minute by a few of us.
Morten Egan (or, as I believe he is called in Denmark now, The Traitor) mentioned to me months ago that if there was no user group when we arrived in Singapore, then we should start one. At the time he was the chairman (now retired) of the Danish Oracle User Group (DOUG, strangely enough) and, as I've presented at and supported various Oracle user events over the years and am an ACE Director, it seemed fitting that we should try to build something for the Singapore Oracle community.
The fact that the Oracle ACE Hemant Chitale works for the same company and that the ACE Director Bjoern Rost would be spending a few days at my place before continuing on to the OTN APAC Tour was too much of an opportunity. After a short chat on Twitter we decided to bite the bullet and I started researching venues and contacted some of the locals. We only had 6 days to arrange it so it was either brave or stupid!
As it came together and (through a few very good contacts) we had more and more attendees registering it started to seem like a reality and eventually Bjoern, Madeleine and I found ourselves walking along to the Bugis area on Monday, hoping for the best. Despite some initial problems finding the venue, we arrived to find the extremely helpful Sean Low of Seminar Room who took excellent care of us.
Within a matter of 15 minutes or so, 33 of the 36 or so who had registered were safely settled in their seats (including my other half Madeleine, who *never* attends Oracle stuff!) for my brief introduction, during which Sean insisted I try out the hand-held microphone.
My big Sinatra moment (not).
First up was Bjoern Rost of Portrix with "Change the way you think about tuning with SQL Plan Management" which, as those who've seen me present on the subject at Openworld, BGOUG or UKOUG would know is a subject dear to my heart. However, Bjoern seems to have had much more success with it than my failed attempts that were damned by literal values and Dynamic SQL. (I've since had a little more success, but mainly as a narrow solution to very specific problems.)
As you can see, the room was pretty full and the audience very attentive (except for a few people who appear to be mucking around with their phones!). They weren't afraid to ask some interesting and challenging questions too, which I always find very encouraging.
Early in Bjoern's presentation we suffered what I would say was the only significant disappointment of the night as both the drinks and the pizza turned up early! It was nice of the delivery companies not to be late, but my stupid expectation that 7pm meant 7pm ensured that I was standing at the back of the room surrounded by obviously gorgeous pizza that was slowly going cold, not knowing whether I should stop Bjoern in his tracks or not. Manners dictated not (particularly as there were so many people in a small room) but the pizza experience later suggests I was wrong. Lesson learned! (Note that I had to ask others about the pizza as it's on my extensive list of things I don't eat.)
What obviously didn't go wrong at all was the social interaction between all of the attendees and speakers. It probably helped that there were a few attendees from some organisations and that people from different organisations had worked with each other in the past but it's a *long* time since I've felt such a vibrant energy during a break.
I was on next, presenting on "Real Time SQL Monitoring" and, apart from a few hiccups with the clicker I borrowed from Bjoern and a couple of slide corrections I need to make, I think it went reasonably well and people seemed as enthused by SQL Mon reports as I've come to expect! With that done, and a quick smoke (I *love* organising an agenda), it was time for Morten with his "Big Data Primer".
I think this might have been many people's favourite presentation, because it wasn't just about Oracle and Morten packed in plenty of the humour I've come to expect from him. Better still, it seemed to work for quite a cosmopolitan audience, so good work!
Afterwards he said a few words asking for people's feedback and whether there was a desire to set up a local user group or just continue with these informal sessions (sponsors permitting), and all of the feedback I heard later showed that people are very keen for a repeat run.
Overall, Monday night felt like a great success.
The passion and enthusiasm of the attendees was very encouraging and reflected in the subsequent feedback which has been consistently positive but also thoughtful so far. There's no question that a decent minority of the local Oracle community are looking for regular opportunities to hear decent speakers on subjects that interest them, meet and discuss issues with each other and also offer to present themselves, which is a great start for any Oracle User Group.
Strangely, I discovered a day or so later that there are already plans for a User Group and the Singapore launch event is next Wednesday. Coincidentally this is only 9 days after SOS! You can look into the APOUG website here and a number of colleagues and I will attend the launch event. I suppose it's a small shame that it's an APAC-wide user group, rather than specific to Singapore, which the number of attendees at such short notice would suggest Singapore can justify, but I'll be interested to see what APOUG has planned.
Big thanks to Alvin from Oracle for endless supplies of fine pizza and to Bjoern Rost of Portrix Systems for the room hire (I bought the drinks, which some would say was appropriate, but I couldn't possibly comment), and thanks again to all the attendees for making it a fun night!
I didn't notice until I was about to post this that Bjoern had already blogged about the evening and I think he's captured it perfectly.
Easing the path for organizations to launch big data-styled services, Red Hat has coupled the 10gen MongoDB data store to its new identity management package for the Red Hat Enterprise Linux (RHEL) distribution.
Digitization, technological development, big data and machine learning make it possible to use the available information ever more effectively and to maximize the effectiveness of communication with the customer. One solution based on quality data is so-called transactional marketing, which according to three quarters of marketers will replace the existing forms of promotion, advertising and customer loyalty building.
An Apache HTTP client "bug"/weirdness I ran into recently, which would end up consuming a large number of ephemeral ports (client side) instead of reusing connections (fix, description). The ports would end up waiting in the TIME_WAIT state for a long time and the client would eventually stall, unable to make any new requests.
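The reuse-versus-reopen behaviour can be demonstrated with Python's standard library (the original issue was in the Java Apache HTTP client; this is a sketch of the concept, not that library's API). Reusing one keep-alive connection keeps every request on a single local port, whereas opening a fresh connection per request burns a new ephemeral port each time and leaves the old sockets in TIME_WAIT.

```python
# Sketch: three requests over one keep-alive connection share one local port.
import http.client
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

class Handler(BaseHTTPRequestHandler):
    protocol_version = "HTTP/1.1"   # enable keep-alive

    def do_GET(self):
        body = b"ok"
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # silence per-request logging

server = HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()

conn = http.client.HTTPConnection("127.0.0.1", server.server_port)
local_ports = set()
for _ in range(3):
    conn.request("GET", "/")
    resp = conn.getresponse()
    resp.read()                              # drain the body so the socket can be reused
    local_ports.add(conn.sock.getsockname()[1])
conn.close()
server.shutdown()

print(len(local_ports))  # one distinct local port for all three requests
```

Fully reading (or closing) each response body is what makes the underlying socket eligible for reuse; skipping that step is a common way clients end up opening a new connection per request.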
Big data stuff. Naturally, any list is incomplete without big data:
Nutanix, the storage vendor, talking about how they use Cassandra. Ironic that they use a distributed, open-source project when they themselves are in the distributed, expensive, hardware-assisted storage space.
Dear Desktop Engineering Reader: Are your workflows efficient, knowledge-driven and optimized for today's competitive engineering challenges like increasing product complexity, Big Data and new technologies? Chances are good that a lot of your workflows and their constituent tasks are really digitized versions of your fumbling first steps toward an end that used to work back ...
Facebook is growing rapidly from a fun space where people share their personal life journeys into a powerful search engine, breaking news source and entertainment platform. With its incredible reach, marketers have naturally gravitated to the platform to reach customers in new ways. Since the first friend request was made on the Harvard campus in […]
The real Billy Beane and Brad Pitt, who played him in 2011's Moneyball. "I was the highest-rated amateur player in 1980, alongside Darryl Strawberry, because I looked like a baseball player. They rated me based on all the things that got me elected homecoming king but didn't yield returns on the baseball fields." So began […]
Well, the topic may seem like a pretty old concept, yet it is a vital one in the age of Big Data, Mobile BI and the Hadoops! As per the FIMA 2012 benchmark report, Data Quality (DQ) still remains the topmost priority in data management strategy: "What gets measured improves!" But often a Data Quality (DQ) initiative is…
I purchased an iPad for my grandparents (both 90+) a couple of years ago so they could Skype my mum, who lives in the UK. Since they don't have a landline internet connection (they don't have a computer), I put a Skinny Data Only SIM card in their iPad and have been topping it up by 5GB ($60) every 3 months, since they only Skype her once a week (occasionally twice) and use about 300MB per call on average (it can vary from 200-500MB).

This has been perfect for them, and despite being very anti-technology they have warmed to their weekly Skype calls with their daughter, whom they haven't seen for 15 years and who is very ill and bedridden and therefore cannot travel to see them. I log onto Skinny from wherever I am in the world at the time and top up and apply the 5GB Data pack to keep them in contact for the next 3 months; they don't need (or want) to know anything else about what a top-up is, or email, or anything else related. They struggled just to work out how to use Skype, and even now they are scared of it, but they have worked out how to make a call and feel proud of themselves every time they use it.

I have just logged into their account to check on their data usage and noticed that the 5GB Data Combo is no longer available. Can anyone advise what has happened to it? There seems to be no mention of it on the website any longer. The only option I appear to see now is the Ultimate Combo (2.5GB of data plus other stuff that would be no use to them on an iPad-only setup, i.e. calls and texts); however, the cost of this over 3 months would be $138, compared to the $60 it has been costing to date. There is another package below this, Big Data, but it only provides 1GB of data per month, where they use an average of 1.7GB a month with their weekly calls and the overhead of app updates etc., so it would not suffice.

Even if I were to tell them they need to reduce their Skype usage, it would still come in more expensive on Big Data than before, at $78 for 3 months. They still have data in their account to last another couple of months at most, but I'm a bit stuck on what I am going to do after that. I would really appreciate hearing people's feedback on what has happened to the Skinny Data Only combos, and advice on what to do in this scenario.
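For what it's worth, the arithmetic in the post can be laid out explicitly. The figures below are as reported by the poster, not confirmed Skinny pricing, and every plan is costed over the same 3-month window.

```python
# Cost-per-gigabyte comparison using the figures quoted in the post
# (illustrative arithmetic only; plan names and prices are as reported).
plans_3mo = {
    "Old 5GB Data Only combo": {"gb": 5.0, "cost": 60},   # 5GB per 3 months
    "Ultimate Combo":          {"gb": 7.5, "cost": 138},  # 2.5GB/month * 3
    "Big Data":                {"gb": 3.0, "cost": 78},   # 1GB/month * 3
}

cost_per_gb = {name: p["cost"] / p["gb"] for name, p in plans_3mo.items()}
for name, c in cost_per_gb.items():
    print(f"{name}: ${plans_3mo[name]['cost']} per 3 months, ${c:.2f}/GB")
```

On these numbers the discontinued combo worked out at $12/GB, against $18.40/GB for the Ultimate Combo and $26/GB for Big Data, which is why both alternatives come out more expensive.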
Looking for Big Data Rock Stars interested in an Internship at Bytes Johannesburg. Main Purpose of the Job (In one sentence).... From The Altech Group - Tue, 01 Aug 2017 11:27:47 GMT
Big Data is feeling the heat. Among others, the heat of IoT and the heat of Big Data Application Performance Management. In this post, we're going to look at how Big Data is revolutionizing how it deals with...
Big data, or the notion of it, is one of the more significant issues confronting today's nonprofit leaders. From large, international enterprises to single entrepreneurs, information about customer transactions, communications connections and purchase preferences exists from a broad array of sources. Some organizations have synthesized their data into meaningful insight that drives or transforms their business. Netflix, for example, gleaned so much value from its subscribers and their habits that it was willing to make an educated roll of the dice on producing original content, and changed the media landscape in the process.
However, Netflix is an exception, perhaps even a rarity: it employed technology, data and the strength of its convictions to leverage big data to its advantage. For nonprofits, technological complexity, the sheer volume of data and the understandable reluctance of leaders to base their futures on this mix stop big data in its tracks. This is not a new occurrence: thousands of research reports and customer behavior overviews have gathered dust in office bookcases for decades. The tools behind big data just make the problem bigger.
Perhaps the time has come to apply a more useful and manageable version of big data and to focus these data and technology advances toward specific, fundamental questions nonprofit leaders ask themselves about the viability of their organizations. At face value, this might sound like a typical research project, but the value lies in the approach. The keys are:
• Gaining access to a manageable base of customers and prospects
• Utilizing analytical tools that reflect today's level of technological sophistication and ease of use
• Asking the right questions in a straightforward manner based on standard research protocols
• Assembling the results in an easy-to-understand format, focused on action; in other words, presenting insights that can be employed to refine and grow the organization
Why do people go to museums?
This might sound like an ethereal question for the ages, more philosophical than practical. But the arts in the 21st century are facing systemic challenges: competition, funding, even access (through technology). Leaders of the Association of Art Museum Directors (AAMD) addressed this issue during a seminar at the Aspen Institute last spring, where they asserted that "the model of the art museum has never been more challenged and in need of creative re-imagination." In addition, these challenges have become more vexing since there are no current comprehensive best-practice guidelines for this environment, and they recognize that potential solutions may be based on each museum's unique market and audience.
Based on these challenges, I collaborated with an art museum to determine why patrons and members visit, based on their attitudes and motivations. The research drew on John Falk's The Museum Experience, in which he suggests that visitors fall into five categories according to their motivations: curiosity, social interests, experience-seeking, art as a hobby and emotional recharging.
To apply Falk's hypothesis to the museum, we polled more than 1,300 respondents associated with the museum and the local community by e-mail. These respondents were asked a series of 15 attitudinal questions about their habits and asked to rank their agreement on a 1-5 scale. In addition, they were polled for demographic information and their participation in other performing and visual arts. The data was analyzed with Qualtrics, a leading U.S. online survey research firm.
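As a sketch of the kind of tabulation involved, mean agreement per Falk category could be computed like this. The responses below are invented for illustration; the museum's actual analysis was run in Qualtrics.

```python
# Hypothetical sketch: mean agreement on a 1-5 Likert scale, grouped by
# Falk's five motivation categories. All response data is invented.
from collections import defaultdict
from statistics import mean

FALK_CATEGORIES = ["curiosity", "social interests", "experience-seeking",
                   "art as a hobby", "emotional recharging"]

# (category, agreement score) pairs, one per answered attitudinal question
responses = [
    ("social interests", 5), ("social interests", 4), ("curiosity", 3),
    ("art as a hobby", 4), ("emotional recharging", 2), ("social interests", 5),
]

by_category = defaultdict(list)
for category, score in responses:
    by_category[category].append(score)

for category in FALK_CATEGORIES:
    scores = by_category[category]
    if scores:
        print(f"{category}: mean agreement {mean(scores):.2f} (n={len(scores)})")
```

Ranking the category means is what surfaces the dominant visitor motivations, which is essentially what the museum's findings below report.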
The museum found that, beyond the art itself, its patrons regarded a visit as a holistic immersion in an overall experience. Among the top responses, patrons felt the museum:
• Was valuable because "it is an important part of the community"
• Was an integral part of their "overall interests in the arts"
• Provided "an insider's view of art" and the artist's creative process
• Contributed to the social connectedness of friends and family
Why did this data matter to the museum? First, it validated many of the museumâs activities, including membership, educational initiatives and special events. Second, it gained specific direction to enhance offerings of value to its patrons and prospects. Third, it developed a framework to promote holistically its unique artistic experience, reflecting the demographics and the culture of the community. All these insights were new to the museum and would not have been available without the speed and depth of analysis provided by current data gathering and technology. But most importantly, the museum took the first step to re-visioning its role as an artistic entity and important element of the local community.
If a focused approach to big data can work for an art museum, it has possibilities in your organization as well.
John Klein is principal of Trilithon Partners, a marketing consultant firm for nonprofits.
Each year around this time the analysts at Gartner go to work on a series of widely read reports called Hype Cycles that summarize the maturity and velocity of a staggering number of technologies. Spoiler alert: this year, we will…
As revolutions go, the Big Data Revolution is a tough one to get really passionate about. Revolutions work best when they rally around simple, inspiring principles: freedom, equality, really handy devices…. But Big Data, which Gartner defines as "high-volume, high-velocity and…
Change, innovation, progress: while these terms should always be associated with the positive, for marketers entrenched in their current methodologies the future can seem downright scary as the lines blur. The ways in which consumers can now discover, consume and engage with products and services make it a challenge for marketing professionals to keep up. It only makes sense that with dynamic change, some concepts fall by the wayside, while others emerge to further our goals. Here are some pertinent predictions for 2013. Is outbound marketing over? 2012 saw the continued growth of social media and "Big Data" […]
CILIP North East invite you to a debate on user privacy in libraries held at the Mining Institute in Newcastle.
The motion to be debated is:
This house believes that protecting users' privacy in libraries should take precedence over any other demands on users' data.
Debate chair: Dr Biddy Casselden
Biddy is a Senior Lecturer at Northumbria University, in addition to being a programme leader for the MA/MSc Information and Library Management DL course. Prior to becoming an academic, Biddy worked in a variety of information management posts.
Proposing the motion:
Ian Clark is a subject librarian at the University of East London and a co-founder of Voices for the Library. Involved in the Radical Librarians Collective, he is active in encouraging awareness of privacy and intellectual issues in libraries and last year wrote a paper on the relationship between surveillance and the digital divide.
Alex is a recent law graduate and continuing academic, specialising in technology and internet law. He currently acts as organiser for the Open Rights Group's local North East group, and helps to organise events teaching practical digital security with CryptoParty Newcastle.
Opposing the motion:
Robin Smith, Head of cybersecurity at West Yorkshire Police
Peter is an experienced Data Protection Officer, currently working in the Information Security Team at Newcastle University. He is studying for an LLM in Information Rights Law and Practice at Northumbria University, with an interest in the legal and privacy implications of the use of big data in learning analytics in higher education.
This debate is open to anyone interested in libraries and/or privacy.
CILIP North East members are invited to attend the group's AGM prior to the debate starting (while everyone else gets to admire the beautiful Nicholas Wood Library or sip on their drink a bit longer!)
With limited time and floods of inbound RFPs to respond to, proactive group prospecting can be hard. However, it's critical that hotels sell proactively to keep up with the increased competition resulting from new inventory coming into nearly every US market. Independent hotels often have to work harder to sell their property without the backing of a brand affiliation, but big data can make proactive selling easier, more efficient and more effective. Kerry Kuhl, National Sales Manager at The Kahala Hotel & Resort, demonstrates how she successfully generates new group business, using group data and booking patterns to develop time-saving tactics and let the data do a lot of the heavy lifting for her.
The Lewis Baltz photography exhibition on show at the Fundación Mapfre until June of this year has been conceived as a grand retrospective that allows us to enjoy the work of this great photographer, little seen in our country.
Lewis Baltz died in Paris, perhaps on a day of pouring rain. He had arrived in Europe in the eighties fleeing the Reagan era, changing his register and his palette, far from a United States today shaken under the Trump era. His reflections on new technologies and unbridled industrialization, together with the movements of populations that are leading to the destruction of the planet, occupied the central core of his work.
When photography was still understood as a craft useful for recording houses, things, events or people, Baltz was already using photography as a means of interpreting reality and working with the landscape wounded by the passage of man. No human beings appear in his photographs, only their traces, which construct the landscape.
Lewis Baltz is a conceptual artist who uses photography, of impeccable formal resolution, to recount the destruction and desolation of the landscape.
He trained as a photographer between the San Francisco Art Institute and the Claremont Graduate School in California. More than 40 years ago, a 26-year-old Lewis Baltz began to build the coherence of his work, backed by the Leo Castelli Gallery from his first solo exhibition in 1971, Tract Houses, and above all through his inclusion in 1975 in the New Topographics group.
Lewis Baltz, Tract House #22, from the portfolio The Tract Houses, 1971
Photographers who were credited with artistic capabilities, placing them on the same plane as sculptors or the pioneering artists of Land art. In 1990 the Bechers received the sculpture prize at the Venice Biennale; a special category for photography had not yet been established, but had their work not been valued within the artistic field as a hybrid of sculpture, architecture and photography, they would not even have been considered for the award.
This group of photographers, governed by the spirit of the New Topographics, is described as devoid of emotion, but in truth they simply demolished the romantic idea of the landscape, however many viewers still stand inevitably transfixed before a sunset saturated with ochres, oranges, reds and yellows.
The early Baltz works in black and white, concerned with man's cultural relations with the landscape, with those unstable boundaries between the urban and the rural, and with how unknown territories are altered by industrialization and by the relationship established on discovering the developable value of land, reminding us of Richard Cantillon's eighteenth-century maxim, seemingly still in force, that wealth lies in the land.
With The Power Trilogy (1992-1995) he continues to delve into the uses of new technologies as mechanisms of power, whether surveillance machines or our vulnerability before the machines of the new medicine. Today's Big Data. The great eye that not even Foucault could have dreamed of.
At the same time, the materialization of his work, from black and white and a more or less manageable format, drifts toward site-specific pieces and large installations such as Night Watch (Ronda de noche), presented in 1992 at the Centre Georges Pompidou, where he establishes a clear connection with Rembrandt's monumental painting (1642).
Rembrandt's monumental canvas captures the moment when the military company of Captain Frans Banninck Cocq and Lieutenant Willem van Ruytenburch prepares to begin its night round through the city of Amsterdam, watching over the keeping of order. The arquebuses and lances are replaced by cascades of glass fibre and panels of windows lit with artificial light. The faces of the arquebusiers, individualized by the artist so as to be recognizable, are replaced by a single robotic face without identity.
My company, Insight Engines, recently announced Series A funding to make big data easily queryable by everyone. We're bringing natural language technology to the cybersecurity domain, so you can use plain English search queries to navigate large datasets for security investigations. If you're also interested in the intersection between NLP and cybersecurity, we're hiring.
DataFlair Web Services Pvt Ltd [www.data-flair.training] Openings For Freshers: BE/BTech/BSc/BCA/MCA/MSc – 2016/2017 pass-outs: Technical Content Writer @ Indore. Exclusive Job For PresentJobs.com. Company Profile: We are leading providers of online training on the latest niche Big Data technologies like Hadoop, Flink, Spark, Scala, HBase, Kafka, Storm etc. ...
Alright! You have probably been hearing a lot about Big Data, Data Scientists and the like. The big data craze was already in full swing when the Harvard Business Review published an article three years ago titled "Data Scientist: The Sexiest Job of the 21st Century". And in order to become a […]
Learn the most appropriate guidelines for implementing Big Data projects, managing the information available to a company in the most effective way in order to achieve results and make decisions in line with the business strategy. Watch IDGtv's "Big Data Day. Understanding the data". 24 February 2015.
Vea "El dÃa del Big Data. Negocio e innovaciÃ³n a partir del dato", una sesiÃ³n en la que se abordan las mejores prÃ¡cticas para gestionar, aprovechar y medir el impacto de toda la informaciÃ³n que rodea a la empresa.
Companies are producing massive amounts of data, otherwise known as big data. There are many options available to manage big data and the analytics associated with it. One of the more popular options is Apache Hadoop, open-source software designed to scale up and down quickly with a high degree of fault tolerance. Hadoop lets organizations gather and examine large amounts of structured and unstructured data.
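As a rough sketch of the MapReduce model Hadoop popularized, here is a word count written as plain Python functions. A real Hadoop job would run equivalent mapper/reducer steps distributed over HDFS (via Hadoop Streaming or the Java API); this stand-alone version only illustrates the map, shuffle and reduce phases.

```python
# Minimal in-process sketch of MapReduce-style word counting.
from itertools import groupby
from operator import itemgetter

def mapper(line):
    # map phase: emit (word, 1) for every word in a line of input
    for word in line.lower().split():
        yield (word, 1)

def reducer(word, counts):
    # reduce phase: sum the counts for one word
    return (word, sum(counts))

def run_job(lines):
    # shuffle phase: sort all intermediate pairs by key, then group by key
    pairs = sorted(kv for line in lines for kv in mapper(line))
    return [reducer(k, (c for _, c in group))
            for k, group in groupby(pairs, key=itemgetter(0))]

print(run_job(["big data big", "data tools"]))
# [('big', 2), ('data', 2), ('tools', 1)]
```

The appeal of the model is that the mapper and reducer contain no distribution logic at all; the framework handles partitioning, shuffling and fault tolerance.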
How a Small Big Data Company Could Help Save Thousands of Lives on American Highways and Save Millions of Dollars for the Trucking Industry. SURREY, BC / ACCESSWIRE / June 7, 2016 / DSG GLOBAL INC. (DSGT), a proven industry …
The full programme (with a synopsis of each talk/workshop/keynote) can be seen here.
10h - 11h - GNU/Linux - It is not 1984 (or 1969) anymore - Jon "Maddog" Hall
SIMULATING PHENOMENA WITH GEOGEBRA - Marcela Martins Pereira and Eduardo Antônio Soares Júnior
12h - 13h - Open collaborative spaces - Guilherme Guerra
13h - 14h - (grab something to eat) and try to split myself between: Technological illiteracy and teacher training - Antonio Carlos C. Marques - and Internet of Things: Creating APIs for the real world with Raspberry Pi and Python - Pedro Henrique Kopper
14h - 16h - Official opening of Latinoware
16h - 17h - Hands-on video editing with kdenlive - Carlos Cartola
17h - 18h - red#matrix, much more than a social network - Frederico (aracnus) Gonçalves Guimarães
10h - 11h - Copyright and the precautions to take when using "cloud" and "free" services to build educational objects - Márcio de Araújo Benedito
11h - 12h - Collaboration and Free Tools: possibilities for counter-hegemony in the school - Sergio F. Lima
12h - 13h - Free Teacher! The use of free software in teacher-training degrees - Wendell Bento Geraldes
13h - 14h - (grab something to eat) and Open documentation standards - ODF - Fa Conti
Not just a little bit. They are now partners in every bit of "Big Data", with the approval of Google and Facebook. Perhaps worse, since it (the United States government) will supposedly soon be able to store every piece of data that has ever passed through the entire internet - forever - and could theoretically break any encryption. (…)
The objective of this webinar is to review types of datasets commonly used in this field, discuss potential pros and cons of each type depending on research question, and identify other key considerations. To gain insights on this topic, join the live webinar broadcast taking place on Thursday, August 21, 2014 at 2pm EDT.
Historical records are fascinating, aren't they? I'm not the historical type, but my wife is, and she's the one who professionally works within the family-tree arena. I kind of look at these things with a glazed look in my eyes, and it's very evident I'm rather uninterested in historical timelines.
But late last year my wife sat me down (yes I was coerced) and she systematically showed me the structure of my family name tree. For the first time I had a very relaxed feeling as my wife took me through the past four centuries of her detective work relating to my family name.
It was fascinating, and I eventually got it. I appreciated her work and was so interested in her system of operation as she whittled away the false avenues she often went down, eventually arriving at a proven source and then moving on to establish a new historical avenue to investigate. The potentials and probabilities are so captivating. It is true detective work.
And this got me thinking about historical time-lines and their significance.
I asked myself, and now you the reader: what will historical records say? Of course as always I am speaking directly about the visual mapping arena.
For the past 50 years or so we've been introduced to the formalizing of hand-drawn Mind mapping by the one and only Tony Buzan. And of course we've since then witnessed and experienced an exponential expansion of the original thought, method and evolved expressions of the original methodology.
The constant, though, is this: it's all based on Buzan Mind mapping, no denying that at all. And I can't help but bend the knee of respect to Tony Buzan for formalizing what was named Mind mapping into a structure that has indeed changed the lives of numerous adherents of this fascinating synaptic tool.
And for those who would suggest Mind mapping has been around for millennia: well, they're right, as "historical records" do prove a form of mind mapping has existed throughout human history.
The works of Roy Grubb are in my opinion such an important historical record relating to all things graphical data, information and knowledge mapping. Go sift through his awesome work and soak in the information and knowledge Roy presents at his domain.
Mind mapping became Visual mapping by virtue of the inclusion of multiple graphical formats being added to the original radiant Buzan approach.
And: Visual mapping has by virtue of this digital age morphed (even evolved) into knowledge mapping. And of course it is evolving continuously along with technology.
It does seem we are at an epoch of great advancement, even to the point of enhanced reality and Artificial Intelligence being added to the mix, which seems to be taking the original format down the rabbit hole of mind-bending possibilities. What those possibilities are, we can only surmise at this time, but suffice to say it may indeed be mind-bending.
So: historical records are important, and as we push exponentially forward into uncharted territory with the help of big data, AI and augmented-reality systems, we must establish, support and enhance a true historical record of where this arena started (or was rebooted) from. Let's make sure we've got an accurate point of reference for a historical record.
What do you believe the historical records will say about where this arena had its actual genesis, where it has been and where it's going?
As CEO of SimTech Systems Inc I'd like to share my thoughts with you.
Yes we've been the developer of MindMapper and Thinkwise for the past 20 years. And regarding the question of visual thinking arena's past, present, and future?
Visual thinking is one of the thinking attributes born to all human beings. There are different methods and degrees of visual thinking, but everyone, whether they know it or not, thinks visually.
Visual mapping software is an excellent tool that helps users to think creatively and logically utilizing the tree structure as a visual backbone. However, we must realize that visual mapping is just one method out of many that facilitate visual thinking.
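At bottom, a mind map is a tree, which is why the tree structure works as a visual backbone. A minimal sketch (topic names invented for illustration) of storing one as nested dicts and rendering it as indented text:

```python
# A mind map represented as a tree of nested dicts, rendered as an outline.
def render(node, depth=0):
    """Return the map as a list of indented outline lines."""
    lines = ["  " * depth + node["topic"]]
    for child in node.get("children", []):
        lines.extend(render(child, depth + 1))
    return lines

mind_map = {
    "topic": "Visual mapping",
    "children": [
        {"topic": "Mind mapping", "children": [{"topic": "Buzan method"}]},
        {"topic": "Knowledge mapping"},
    ],
}
print("\n".join(render(mind_map)))
```

The same tree can just as easily be drawn radially (the Buzan style) or as an org chart; the underlying data structure does not change, only the rendering.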
Let's look at the advancement of the visual mapping arena from the software perspective only. Computer-generated mind maps became available in the mid-1990s. The digital mind map market really opened up when MindMapper's key feature of converting a map to an MS Word document was received with enthusiasm, for the first time, at Comdex 2001.
Around 2010, we witnessed increased expansion of creative thinking at the team level through collaboration. Currently, individuals and teams are using mobile and the cloud for their visual mapping needs.
The biggest contribution visual mapping has made is that visual diagramming skills once available only to professionals became accessible to many, helping simplify and expedite their daily activities. In short, it has fundamentally contributed to achieving happiness and success by drawing on the visual and whole-brain thinking capabilities given to all human beings.
A lot of people are still puzzled and want to know why the visual mapping software industry has not solidified itself as mainstream. But one must ask whether such a question, or the expectation behind it, is valid and beneficial.
A lot of people cannot distinguish visual mapping from mind mapping. Visual thinking and visual tools had been around even before the mind map existed, and they are not limited to tree-structure visualization.
Visual mapping products, when they came to market, emphasized the main benefits and positive effects inherent only to mind mapping. As a result, mind mapping became a buzzword that the general public equated with visual thinking.
Many visual mapping software developers from 2000 to 2005 used the mind map in their marketing efforts. As time passed, however, many realized that mind mapping software is nothing more than information visualized in a tree structure, and started to lose interest in the whole arena. On top of that, it is tough to replicate the original mind map's memory benefits with a digitally created map. Ironically, the mind map contributed to the rise of the visual mapping market, yet in the process it also contributed to the loss of visual mapping's identity and definition.
Salt is used in almost all food, yet salt is never presented as food itself. Visual thinking is a God-given natural talent that everyone uses in their daily activities, but the mind mapping technique, or tree-based visual mapping, is just a small part of the visual mapping arena. We all know of a visual mapping developer that positioned itself as a project management tool in the early 2000s as a way to distinguish itself from the rest of the industry, but failed to emerge as a viable project management tool.
Mind mapping, or any tree-based visual mapping software, can work together with groupware, ERP, and the like to create synergy, but it can never become a substitute or replacement for them. From this perspective, I foresee future products either integrating with mainstream products or evolving to combine technologies such as databases, big data, and AI with visual mapping's visualization.
MindMapper, too, has contributed to the growth of the visual mapping arena for the past 20 years. "Value creation through innovation" is MindMapper's direction for the next stage of development. To create value, you need to ideate and execute. Our goal is to evolve into a more comprehensive tool that can facilitate idea to action, and our current iteration, MindMapper 16, is the first product of a development process pursuing this objective.
Young G Chung, CEO, SimTech Systems, Inc., developer of MindMapper
Big data introduces data storage and system performance challenges. Keeping your growing tables small and efficient improves system performance, as smaller tables and indices are accessed faster; all other things being equal, a small database performs better than a large one. While traditional data purge techniques work well for smaller databases, they fail as the database size scales up into a few terabytes. This tutorial discusses an algorithm to efficiently delete terabytes of data from a DB2 database.
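The batching idea behind such purge algorithms can be sketched independently of DB2: delete in fixed-size chunks and commit after each one, so no single transaction locks the table or fills the transaction log. A minimal sketch (using SQLite here for portability; on DB2 the inner SELECT would typically use `FETCH FIRST n ROWS ONLY`, and the `events` table and `created_at` column are invented for illustration):

```python
import sqlite3

def purge_old_rows(conn, cutoff, batch_size=1000):
    """Delete rows with created_at < cutoff in small batches,
    committing after each batch so locks and log usage stay bounded."""
    total = 0
    while True:
        cur = conn.execute(
            "DELETE FROM events WHERE rowid IN "
            "(SELECT rowid FROM events WHERE created_at < ? LIMIT ?)",
            (cutoff, batch_size),
        )
        conn.commit()          # release locks between batches
        total += cur.rowcount
        if cur.rowcount < batch_size:   # last (partial) batch: done
            return total
```

Tuning `batch_size` trades throughput against lock duration; production purge jobs on multi-terabyte tables usually also pace themselves between batches to avoid starving concurrent workloads.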
Satellites and GPS technology are tools that have reached big industry. Take railroads, for example: one piece of technology, the hot box detector, can replace 500 employees. Bad for employees, good for stockholders. Is this beneficial for society? Only time will tell. A new farming trend is precision farming. Goodbye to conventional farming? […]
BIG DATA DOESN'T PAY. Software programs that use police records to predict crime hot spots may result in police unfairly targeting low-income and minority communities, a new study shows.
Big data is everywhere these days and police departments are no exception. As law enforcement agencies are tasked with doing more with less, many are using predictive policing tools. These tools feed various data into algorithms to flag people likely to be involved with future crimes or to predict where crimes will occur.
In the years since Time magazine named predictive policing one of the 50 best inventions of 2011, its popularity has grown. Twenty U.S. cities, including Chicago, Atlanta, Los Angeles and Seattle, are using a predictive policing system, and several more are considering it. But with the uptick in use has come a growing chorus of caution. Community activists, civil rights groups and even some skeptical police chiefs have raised concerns that predictive data approaches may unfairly target some groups of people more than others.
New research by statistician Kristian Lum provides a telling case study. Lum, who leads the policing project at the San Francisco-based Human Rights Data Analysis Group, looked at how the crime-mapping program PredPol would perform if put to use in Oakland, Calif. PredPol, which purports to "eliminate profiling concerns," takes data on crime type, location and time and feeds it into a machine-learning algorithm. The algorithm, originally based on predicting seismic activity after an earthquake, trains itself with the police crime data and then predicts where future crimes will occur.
Lum was interested in bias in the crime data (not political or racial bias, just the ordinary statistical kind). While this bias knows no color or socioeconomic class, Lum and her HRDAG colleague William Isaac demonstrate that it can lead to policing that unfairly targets minorities and those living in poorer neighborhoods.
By applying the algorithm to 2010 data on drug crime reports for Oakland, the researchers generated a predicted rate of drug crime on a map of the city for every day of 2011. The researchers then compared the data used by the algorithm (drug use documented by the police) with a record of overall drug use, whether recorded or not. This ground-truthing came from taking public health data from the 2011 National Survey on Drug Use and Health and demographic data from the city of Oakland to derive an estimate of drug use for all city residents.
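The ground-truthing step described here, which combines survey-based use rates with local demographics, amounts to a simple weighted sum. A toy sketch of that calculation (all group names, rates, and populations below are invented for illustration, not the study's figures):

```python
# Toy version of deriving a neighborhood-level drug-use estimate by
# combining demographic-group use rates (as from a national survey)
# with each neighborhood's population counts. All numbers invented.

survey_use_rate = {"group_a": 0.12, "group_b": 0.13}

neighborhood_pop = {
    "west":  {"group_a": 8000, "group_b": 2000},
    "north": {"group_a": 3000, "group_b": 7000},
}

def estimated_users(pop_by_group, rates):
    """Expected users = sum over groups of population x group use rate."""
    return sum(pop_by_group[g] * rates[g] for g in pop_by_group)

estimates = {name: estimated_users(pop, survey_use_rate)
             for name, pop in neighborhood_pop.items()}
```

Because survey-measured use rates differ little across groups, the resulting map is close to a population map, which is why the study's estimate shows drug use spread across the whole city rather than concentrated in a few neighborhoods.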
Drug use in Oakland is probably fairly widespread (left) based on estimates derived in part from the 2011 National Survey on Drug Use and Health. But police records of drug reports and crimes are concentrated in areas that are largely nonwhite and low-income (right).
In this public health-based map, drug use is widely distributed across the city. In the predicted drug crime map, it is not. Instead, drug use deemed worthy of police attention is concentrated in neighborhoods in West Oakland and along International Boulevard, two predominately low-income and nonwhite areas.
Predictive policing approaches are often touted as eliminating concerns about police profiling. But rather than correcting bias, the predictive model exacerbated it, Lum said during a panel on data and crime at the American Association for the Advancement of Science annual meeting in Boston in February. While estimates of drug use are pretty even across race, the algorithm would direct Oakland police to locations that would target black people at roughly twice the rate of whites. A similar disparity emerges when analyzing by income group: Poorer neighborhoods get targeted.
While drug use estimated from public health data is roughly equivalent across racial classifications (top), police using a predictive policing algorithm in Oakland, Calif., would target black people at roughly twice the rate of whites (bottom).
And a troubling feedback loop emerges when police are sent to targeted locations. If police find slightly more crime in an area because that's where they're concentrating patrols, these crimes become part of the dataset that directs where further patrolling should occur. Bias becomes amplified, hot spots hotter.
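That feedback loop is easy to reproduce in a toy simulation (this is not PredPol's model, just an illustration of the mechanism): two areas share the same true crime rate, patrols go to whichever area has more recorded crime, and new crime is only recorded where patrols are.

```python
# Toy illustration of the patrol feedback loop (not PredPol's model).
# Both areas have the SAME true crime rate; each round, patrols go to
# the area with more recorded crime, and crime is only recorded where
# officers are. A tiny initial skew in the records snowballs.

def simulate(rounds=10, true_rate=0.5, records=(11.0, 9.0), patrols=100):
    records = list(records)
    for _ in range(rounds):
        hot = 0 if records[0] >= records[1] else 1  # pick the "hot spot"
        records[hot] += patrols * true_rate         # crime found where we look
    return records
```

Starting from a nearly even record (11 vs. 9), area 0 absorbs every patrol and its recorded-crime lead grows without bound, even though the underlying crime rates are identical.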
There's nothing wrong with PredPol's algorithm, Lum notes. Machine learning algorithms learn patterns and structure in data. "The algorithm did exactly what we asked; it learned patterns in the data," she says. The danger is in thinking that predictive policing will tell you about patterns in the occurrence of crime. It's really telling you about patterns in police records.
Police aren't tasked with collecting random samples, nor should they be, says Lum. And that's all the more reason why departments should be transparent and vigilant about how they use their data. In some ways, PredPol-guided policing isn't so different from old-fashioned pins on a map.
For her part, Lum would prefer that police stick to these timeworn approaches. With pins on a map, the what, why and where of the data are very clear. The black box of an algorithm, on the other hand, lends undue legitimacy to the police targeting certain locations while simultaneously removing accountability. "There's a move toward thinking machine learning is our savior," says Lum. "You hear people say, 'A computer can't be racist.'"
The use of predictive policing may be costly, both literally and figuratively. The software programs can run from $20,000 up to $100,000 per year for larger cities. It's harder to put numbers on the human cost of over-policing, but the toll is real. Increased police scrutiny can lead to poor mental health outcomes for residents and undermine relationships between police and the communities they serve. Big data doesn't help when it's bad data.
Aug. 05 -- Charlotte trails national averages on transportation and land use patterns while showing improvement in energy use and other local measures, says a first-of-its-kind sustainability report card released Tuesday.

The nonprofit group Sustain Charlotte examined data trends in nine categories to produce the report, which is aimed at helping local governments set goals and create policies. "We're living in a time when more and more we're making decisions using big data," said Shannon Binns, Sustain Charlotte's executive director. "It seems important to have an understanding of whether we're making progress on these issues."

The report assigns two grades for each category, the first measuring local trends and the second a comparison to national averages. The county's best grade was for water usage, in which Mecklenburg was given a B when compared to the nation as well as a B for its own usage trend. Since the 2007 drought, water usage has declined significantly, which is one factor in a number of rate increases by Charlotte Mecklenburg Utilities.

Mecklenburg fared worse on land use and transportation, getting Ds in both when compared to the nation. The trend line for transportation was better, with a B. The report said the Charlotte metro area has been ranked as the fifth most sprawling area in the nation, and that the amount of land used for parks, on a per capita basis, has decreased since 2007.

In terms of transportation, Mecklenburg received high marks for an increase in people using public transportation to get to work (2.5 percent in 2000 to 3.8 percent in 2011), as well as the construction of the Lynx Blue Line. But the report card noted that the area still trails national averages in terms of people who commute by biking, walking or taking public transportation. Charlotte is building a $1.1 billion extension of the Lynx Blue Line to University City, but the Charlotte Area Transit System doesn't have enough money left to build other large rail projects.

Among the trends the report details: the number of local families and children living in poverty doubled between 2000 and 2011; transportation costs are taking larger chunks of personal income; sixty neighborhoods are "food deserts"; sprawling land development continues.

"Overall what this report shows is that there are very few areas in which we are making dramatic strides forward and outshining the national averages," Binns said.

Charlotte City Council member John Autry, who chairs the city's environmental committee, said the region can improve. "Are we a leader (in these areas)? Not today," Autry said. He said the region could make significant improvements, including a "pay as you throw" program in which residents pay for how much garbage they throw away. "That would have a significant impact," he said. County manager Dena Diorio said Mecklenburg's goal is to have a park within a 5- to 10-minute walk of all residents.

The report is aimed at local decision-makers, but the group hopes to also influence individual choices. It's also intended to serve as baseline data for residents involved in the Mecklenburg Livable Communities Plan, which will develop community goals. Sustain Charlotte makes recommendations for each category, with some drawn from sources such as Mecklenburg County's biennial State of the Environment report.

The Z. Smith Reynolds Foundation provides operating grants to Sustain Charlotte. The Davidson College Sustainability Scholars program provided intern Jordan Luebkemann, who helped compile the report's data. The report's other authors include Sustain board member Jennifer Fairchild, staff member Meg Fencil and Binns.

Copyright 2014 - The Charlotte Observer
With the expansion of the digital gold rush, data is moving into the spotlight and becoming a valuable source of information. Estimates are that the digital universe will continue to double at least every two years, reaching 44 zettabytes by 2020, a 50-fold growth compared to 2010. The sheer size of the data lake is staggering, but the million-dollar question remains: how to make sense of the data tsunami and capitalize on it.
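The quoted figures can be sanity-checked with a little arithmetic: 50-fold growth over the 2010-2020 decade implies about log2(50) ≈ 5.6 doublings, i.e. a doubling period just under two years, consistent with the "double at least every two years" estimate.

```python
import math

# Sanity-check of the growth figures quoted above: 50-fold growth in
# ten years implies log2(50) doublings, i.e. a doubling period of
# roughly 1.8 years -- "every two years at least", as claimed.

years = 10
growth_factor = 50                     # 44 ZB in 2020 vs ~0.88 ZB in 2010

doublings = math.log2(growth_factor)   # about 5.64 doublings
doubling_period = years / doublings    # about 1.77 years

implied_2010_size = 44 / growth_factor # 0.88 zettabytes
```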
Storage costs keep plummeting
The phenomenon referred to as Moore's law has been observed for decades, and with the emergence of new technology (SSD, software-defined storage, object storage, etc.) as well as consolidation within the storage industry, prices keep heading south with double-digit declines year-on-year.
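A double-digit year-on-year decline compounds quickly. A small sketch, assuming a 15% annual decline (the exact rate is not given in the text):

```python
# Compound effect of a year-on-year storage price decline. The 15%
# annual rate is an assumption for illustration; the text only says
# the decline is "double-digit".

def price_after(years, annual_decline=0.15, start=1.0):
    """Unit storage price after `years` of compound decline."""
    return start * (1 - annual_decline) ** years

# At 15%/year, the price falls to roughly 44% of today's level after
# 5 years and roughly 20% after 10 years.
```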
Thanks for your comments and a great suggestion regarding property data. It's a topic I have on my 'potential episodes' list and it's a tricky one too - as even amongst the 'stats experts' they interpret things and calculate things differently!
I'm currently trialling a few different data providers and so have some contacts at the big data providers as a result of this - so I'll see what I can tee up!
KUALA LUMPUR, 6 December 2016: According to the market research firm Market Intelligence & Consulting Institute (MIC) in Taiwan, the market for big-data-related applications exceeded 45 billion US dollars in 2016. The market has grown tremendously in the last few years, and big data information processing will become a main driving force for [...]
KUALA LUMPUR, August 9 – The overwhelming impact of the development of cloud, big data analysis, mobile and social networking applications has led to worldwide digital transformation. A visiting delegation of seven (7) leading Taiwanese ICT companies met a big group of Malaysian ICT companies, including a local telecommunication company of great renown, governmental [...]
Paper No. 5856    Dated 12-Jan-2015
Guest Column by Dr. Rajesh Tembarai Krishnamachari and Srividya Kannan Ramachandran
The re-moderation of the world economy set in place over the past few years continues apace. Notwithstanding some lasting damage on the supply side through the 2008 recessionary trough, our outlook for 2015 is bullish, weighing optimistic data trends more heavily than the continued negative sentiment offered from some analyst quarters.
Around the world in 80 (or more) words:
Treating the ten-year US Treasury bond yield as a proxy indicator for that nation's nominal GDP growth, we anticipate the United States will grow around 3% next year. While this does not mark a return to the buoyant 90s, it is better than the secular stagnation hypothesized earlier in 2014. With the US acting as an engine to spur growth, the world economy should also expand by more than 3%. Stability across the world will be maintained, as sparks without a concomitant fury will characterize both overt (e.g. Russia-West over Ukraine) and covert (e.g. China-Japan over Senkaku) animosities. European stagnation from debt and unemployment will be counterbalanced through quantitative easing by the European Central Bank. Similar action in Japan will display the limits of Abenomics. China will prepare for a structural slowdown emphasizing domestic consumption and de-leveraging an over-heated financial sector, all the while growing at a 7% rate that will amaze rivals around the world. Indian reform, even if inadequate, will boost the middle classes and reinforce confidence in the Modi government. African countries will find their commodity boom dissipate and ease of borrowing decline as commodity prices fall and yields rise in the developed world.
a. North America:
Economic benefits arising from the exploitation of shale gas have not only silenced the anti-fracking environmentalists, they have altered the strategic world-view of Washington politicians. As the US aims to overtake even Saudi Arabia in oil/NGL production in 2015 (and the Saudis pull out all stops to prevent it by driving crude prices down), it has markedly reduced its role as a global policeman. Its own economy is on the mend even as a lame-duck president will be bogged down with partisan gridlock. Markets will fret about the mid-year (or earlier?) hike in interest rates, though Main Street, aided by a strong dollar, will likely shrug it off with continued upward movement across different sectors.
Mexico and Canada will benefit from their tight coupling with the United States. Enrique Pena Nieto will claim credit for reforming the Mexican economy, across sectors as diverse as energy and telecom. Pemex, dear to the Mexicans, will face some competition, though nothing remotely similar to the American acquisition of Tim Hortons (dear to the Canadians) will happen. Up north, the Canadian elections in 2015 will reveal whether the country has reverted to its liberal propensities or sticks with Harper's conservative agenda.
b. Latin and South America:
The outlook is disappointing across much of the region. Runaway inflation hammers Argentina and Venezuela; milder ill effects bedevil Brazil, Bolivia and Uruguay. The Maduro regime in Venezuela and the Kirchner government in Argentina continue to flirt with disaster as their GDP growth slips and mass discontent builds up. Dilma Rousseff has stabilized her position electorally, though her policies continue to disappoint investors and have the potential to reignite sudden protests like the 2013 bus-fare protests. Dependence on commodity exports in a time of declining prices does not portend well for any of the South American states, including Brazil. On a positive note, Cuba, already expected by analysts to grow by close to 4% next year, will see a boost to its fortunes accruing from a thaw in relations with the US under Obama.
c. Africa:
African nations have had a great run in the past few years. This arose not only from the boom in commodity prices but also from the need for yield among DM (developed market) investors, resulting in investment in both corporate and public African bonds. In 2015, these factors could dissipate, which will place pressure on countries like Angola, where household spending has risen more than 4,000% since the start of the millennium. Ethiopia and Kenya are expected to continue on a robust growth path. Contradictions abound within Africa, and nowhere are they more visible than in Nigeria: while the northern part struggles under the oppression of Boko Haram, the southern part booms under Goodluck Jonathan's presidency. In neighboring South Sudan, one is reminded of the risk-reward payoff, as the nation widely tipped to experience spectacular growth in 2014 got mired in conflict, with the consequent dissipation of growth potential.
American intervention in Libya undermined the Gaddafi-imposed order and has led to a civil war between the Islamist and secularist factions, which will hold back that nation in the coming year. A more benign intervention was that of the French in Mali in 2013; we expect more calls for Hollande's assistance in 2015. El Sisi has stabilized Egypt after the Muslim Brotherhood interlude in the post-Mubarak era. Though more brutal than Mubarak, the El Sisi regime is being propped up by both the Americans and Saudis, leading us to expect the recent bull run in Egyptian markets to continue. ANC rule in South Africa continues unimpeded. Though atrophied by many scandals, the rule should produce close to 3% growth in the coming year.
d. Middle East:
The region continues to be a cesspool of ethno-sectarian rivalries as the century-old Sykes-Picot agreement unravels. Recep Erdogan has stabilized Turkey and should reap growth on par with other emerging economies. Erdogan's external actions, driven by the AKP's crypto-desire to establish a caliphate, will see him prop up the Islamic State (IS) just enough that it can damage Shia and Kurdish interests, but not enough to threaten his own Sunni hegemonic plans. The Saudi establishment has focused on the removal of the Muslim Brotherhood threat; now it will focus on limiting Shia Iranian influence by keeping crude prices low. Western companies made a beeline to Iran in 2014 in hope of an impending thaw; much will depend on the negotiating ability of the Rouhani establishment on the sanctions front. Dubai and Israel remain insulated from the turmoil around them and could reap the benefit of the uptick in the world economy. The risk of sudden flare-ups like the 2014 Gaza war continues to remain on the Israeli radar.
e. Asia and Australia:
The Asian political scene is remarkably stable, with China, Japan and India looking inward to stabilize their economies under the leadership of Xi Jinping, Shinzo Abe and Narendra Modi, respectively. Some events have gone unnoticed by world media; for example, China starts the year of the goat as the world's largest economy when measured in PPP terms, and for the first time ever, Chinese outbound investments could exceed those inbound. The establishment of China on the world stage has made Xi stronger than any Chinese leader in recent memory, bar Chairman Mao himself. The Abe regime will continue on its reformist route of bringing Japan out of the deflationary zone, while winking at nationalist sentiment calling for a re-interpretation of the country's post-war pacifist role. Down south in India, Modi has surprised supporters and detractors alike with his middle-path approach to reforming the economy and his zealous interest in foreign policy. While reforming cautiously, he has not removed the populist schemes of the previous government. 2015 will see him act unimpeded by local elections (other than in Bihar) and will prove a litmus test of his claims of good governance.
Afghanistan under Ashraf Ghani will face more trouble from the Taliban as the US adopts the Pakistani classification into good versus bad Taliban. In nearby Pakistan, the wildly popular Imran Khan, with some help, perhaps, from the deep state, will challenge the established parties in their home turfs. In Indonesia, Joko Widodo has come to power with Imran Khan-style support among the youth, and he will be hard-pressed to implement his reformist agenda, including reducing fuel subsidies, amidst persistent opposition from entrenched interests. ASEAN will continue to slip on its stated intentions for closer cooperation. Australia will try to balance its strategic partnership with the United States with economic dalliances with the Chinese.
f. Europe and Russia:
Vladimir Putin will be emboldened by the short-term rise in domestic popularity, and hence ignore the longer-term implications of his intervention in Ukraine. Tighter coupling with Kazakhstan and Belarus will not prevent what is likely to be a low-growth, high-inflation year for the Russians. Europe as a whole continues to underperform, most visibly in France and Italy, both of which might record less than 1% growth in GDP. With the Trierweiler-Gayet saga behind him, Francois Hollande will attempt to rein in a deficit running at close to 4% of GDP. Even with help from the ECB's quantitative easing program, there is little expectation that Hollande can avoid being the most unpopular leader among all western democracies. In Italy, high debt and unemployment, exemplified by the statistic of four-fifths of Italians between the ages of 20 and 31 living with their parents, will hamper any efforts Matteo Renzi might take to pull the economy out of its doldrums.
The Greeks might look forward to a better year, especially when juxtaposed against their recent past. On the back of painful reforms, the Greek economy is widely anticipated to commence its long journey back to health, though there might be recurrent political scares and persistent rumors of a Greek exit. The German government will be buffeted by opposing demands: external calls for a more interventionist role in stabilizing the world economy, and internal ones for tempering the same. Cautious progress on the fiscal front will lead to modest GDP growth. Ironically, the European nations with the best GDP growth projections are also the ones with the highest exposure to Putin's misadventures, viz. Poland, Latvia and Lithuania.
Sectors and segments:
Having dropped significantly in the past few months, the level of oil prices affects the prospects for many industry sectors in 2015. Oil is typically expected to revert to the mean, because a lower oil price has a discernible impact on both the supply side (by discouraging investment in its production and distribution) and the demand side (by boosting economic activity). The speed of such mean-reversion remains unclear. Russia, Iran and US shale producers (esp. those not based at strategic locations) suffer disproportionately more than the Saudi establishment at current price levels. Lower oil prices will provide a fillip to consumer discretionary industries and airlines, and have an adverse impact on railroad companies (which benefit from oil transportation) and petrochemical companies. The shale gas boom, apart from increasing housing activity, is also the prime driver behind growth in the US steel and construction material sectors; consequently both sectors will remain susceptible to crude movements.
Low interest rates and low macro-growth prospects will induce companies with excess cash to acquire other companies to report earnings growth. That trend will be apparent in sectors as diverse as healthcare, industrials, semiconductors, software and materials. On the trading side of investment banks, desks will see higher market volatility as major powers pursue divergent paths to monetary policy (e.g. US against EU/Japan). In the US, regulatory obligations increasing the cost of capital for holding certain securities might lead to decreased broker liquidity. 2015 shall see the big banks grapple with the regulations in Basel III and Volcker; one expects the regulatory push towards vanilla deposit-taking and lending to continue. Analysts will hope that stronger balance sheets coupled with a return to profitability lead to increased dividend payouts for investors in financial stocks. China will seek to tame its overheated financial sector amidst a structural slowdown, and India will see RBI governor Raghuram Rajan continue his battle against political interference in corporate lending. Wealth management services will perform remarkably well not only in China, but also, to a lesser extent, in the US, as a rising market creates wealth and a retiring baby-boomer crowd seeks to couple low risk with acceptable return. In the arena of mobile payments, Apple Pay will try to avoid the lackluster performance of earlier attempts like Google Wallet.
Lower gasoline prices and an accompanying increase in disposable income (through wealth creation in the markets, increased home values, reduced unemployment and improved economic activity) create a positive outlook for the consumer discretionary sector. Companies dealing with organic farming benefit from increased health consciousness; the market for yoga will continue to rise, as 2014 saw the UN declare a world yoga day on Modi's initiative. Even as DVDs and Blu-rays fall, digital film subscriptions and on-demand internet streaming will rise to please Hollywood. Bollywood will get over its obsession with INR 100 crore revenues as movies cross that level more frequently. With the supply of hotels remaining at the level of a few years back, revenue per room will rise across the sector. Tighter access to credit continues to hamper the rise in existing house sales, which nevertheless should improve over the past year. Asian apparel manufacturers continue to improve their market share in the fast fashion market. October 2015 will see Europeans benefit from the eCall service in all their new cars, which allows a car to immediately report details to base stations in any accident. New carbon-emission standards also come into force in Europe; elsewhere too, the move towards higher efficiency in cars will continue. Widodo will be pleased at the growth in automobile sales in Indonesia, which should exceed that of other major markets. Internet advertising is rising faster than television commercials, though 2015 will still see the latter dominate the former in overall revenue generated. Privacy concerns continue to erode on the social media front. The newspaper industry will see an increased number of advertorials re-packaged as "native advertising," by which companies pay for advertisements written as paid newspaper articles.
In India, the BJP government is yet to clarify its position on foreign direct investment in retail. Irrespective of its final decision, retail sales should surge sharply upward there as the consummation of pent-up demand from the past few years couples with the thriving 'mall culture' in middle-tier cities. China will also see an increase in retail sales in spite of its investigation into WalMart. The anti-corruption campaign, though, will negatively impact luxury goods sales as well as those of higher-end automobiles there. A strong dollar will affect US companies with significant operations abroad. Wheat production might match 2014's record volumes in Europe, though more newsprint will probably be devoted to higher prices of cocoa from Ivory Coast. Idiosyncrasies of local markets will shine as Dubai invests in large-scale brick-and-mortar malls, while Manhattan gets more of its groceries delivered to doorsteps.
Demand for energy should rise at the same pace as world GDP next year. Analysts will point at attractive valuations of oil companies. If shale gas prices remain attractive, Sabine Pass in Louisiana will emerge as the first plant in the US to export LNG. Four years after the Fukushima incident, Japan will see nuclear reactors back in operation at Sendai.
2014 saw the denizens of the developed world fret about Ebola, breast cancer (through a campaign by actor Angelina Jolie) and ALS (through the ice bucket challenge). Overall, health spending will comfortably outpace the rate of growth of the overall economy. The long-term secular trends driving this are the aging population in the western world (with the population pyramid replaced by a population dome) and an emerging middle class elsewhere with increasing demand for improved access to healthcare. Universal healthcare has been promised for all in India, which should drive up healthcare expenditure by a significant amount there. In 2015, large US companies are mandated under Obamacare to provide insurance to more than 70% of their eligible workforce. Uncertainty on US healthcare reform and the debate thereon may cause short-term price volatility. The Millennium Development Goals will be reviewed by the UN later in the year, with a new set of goalposts announced for countries to meet by 2030; different NGOs will campaign vigorously through the media to get their pet agendas included in the final list.
Transportation companies will report higher earnings from increased economic activity. Apart from some airlines that have suffered reputational damage through recurring accidents, airline companies will benefit from reduced oil prices. The defense industry will see robust growth in China, as "Chi-America" proves to be no mere chimera. Alarmed by this increase, Vietnam, along with the Philippines, will move within the US ambit, and Australia will seek to join the tripartite naval exercises in the Indian Ocean between the US, Japan and India. Tensions in Eastern Europe and the Middle East will favor increases in expenditure across the region. The nationalist government in India will increase defense expenditure sharply even as it moves beyond lip service on the long-standing issue of indigenization of defense manufacturing.
The mantra of social-local-mobile (SoLoMo in tech jargon) continues to drive the consumer markets division of information technology companies. Expenditure on IT hardware is significantly retarded by the increasing move to cloud computing. The move to cloud computing - along with the increasing use of mobile commerce - bodes well for the computer security business. India should see a sharp increase in smartphone adoption; elsewhere, tablet computers will rise against laptops and desktops. Embedded systems coupled with rudimentary networking will be marketed as an all-encompassing internet of things as the era of big data continues. Today, a single family in the US places more demands on data flow than the entire planet did a decade ago, and even this data rate is expected to increase by a whopping 70% over the next year. Consolidation in the cable sector (e.g., Comcast with Time Warner Cable) and the convergence of content with distribution (e.g., AT&T with DirecTV) are two trends that should continue from 2014. Even as Indians talk about 3G coverage spanning the nation, Americans will tweet about 4G price warfare and the Chinese will see ZTE unveil a 5G prototype. Facebook will have more users than China has people. Analysts will harp on the impact of interest-rate hikes on high-dividend-paying telecom stocks. Apart from the financial industry, telecom will emerge as the industry most impacted by federal regulation across the globe.
The anthropologist Edward Weyer once compared the future to a "corridor into which we can see only through the light coming from behind". It is in that sense that we have analyzed the data of the bygone year and tried to extrapolate into the days and months ahead. And when some predictions are falsified - and falsified, some will be - then we shall lay credit for the same at the feet of those responsible - viz. us, the people.
[The authors are based in New York City, and can be contacted through email at firstname.lastname@example.org and email@example.com. The views represented above are personal and do not in any manner reflect those of the institutions affiliated with the authors.]
When I read the blogs and articles about Big Data, the subtext is always Big Money. Even though many of the tools themselves are open-source, they all seem to require Big Infrastructure and a horde of lavishly-paid consultants and brainiacs to get deployed. It's hard to find a "big data" story in which people appear to have paid attention to how much things cost.
This is rather frustrating. While it's interesting in a sort of Hollywood-gossip way to read about how Megacorp or a multibillion-dollar "startup" deployed a zillion-rack Hadoop server farm, cooled by Icelandic glaciers, and baby-sat by a thousand people all over the world, in order to mine fascinating new ways to better separate customers from their cash, it doesn't help me much here in reality-land. Here, I'm "the guy", and we have some money to play with, but not an enormous amount - and we watch every penny.
Fortunately, I have a few tricks up my sleeve, and I'd love to learn more. But I wish the database world would lose its fascination with big-data porn and have some real life examples of how people are solving big-data problems with real-life budgets and personnel constraints.
Many database schemas have similar characteristics, and one common - and important - schema type is what I call the "telemetry" schema. Telemetry schemas have certain things in common:
1. They have a small number of relatively simple tables. I'll call these the "telemetry log tables".
2. They have a time component, and are often queried using the time component.
3. They have at least one score or amount associated with the data, which is also queried frequently.
4. Telemetry log tables are often very large, with hundreds of millions or billions of records.
5. While there may be summary tables that are updated frequently, records in the telemetry log tables are rarely or never updated.
Examples of telemetry schemas include:
1. Actual telemetry data from various types of sensors.
2. Wall Street-style trading data.
3. Other transactional data, such as banking activity, point-of-sale activity, etc.
As mentioned above, telemetry schemas often have a time series component, but also need to be queried in interesting ways using approaches other than the simple time component.
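As a concrete illustration of these characteristics, here is a minimal, hypothetical telemetry log table sketched with Python's built-in SQLite engine (all table and column names are invented for this example, not taken from any real schema):

```python
import sqlite3

# Hypothetical minimal telemetry log table: a time component plus a
# score/amount column, matching the characteristics listed above.
con = sqlite3.connect(":memory:")
con.execute("""
    CREATE TABLE sensor_log (
        sensor_id  INTEGER NOT NULL,
        ts         TEXT    NOT NULL,   -- the time component
        reading    REAL    NOT NULL    -- the score/amount component
    )
""")
con.execute("CREATE INDEX log_by_time ON sensor_log (ts)")
con.executemany(
    "INSERT INTO sensor_log VALUES (?, ?, ?)",
    [(1, "2013-01-01T00:00:00", 20.5),
     (1, "2013-01-01T00:01:00", 21.0),
     (2, "2013-01-01T00:00:30", 19.8)],
)

# The two typical query shapes: by time range, and by the amount column.
recent = con.execute(
    "SELECT COUNT(*) FROM sensor_log WHERE ts >= '2013-01-01T00:00:30'"
).fetchone()[0]
hot = con.execute(
    "SELECT COUNT(*) FROM sensor_log WHERE reading > 20"
).fetchone()[0]
print(recent, hot)
```

At billions of rows the same two query shapes are what stress the schema; the toy above only fixes the vocabulary used in the rest of this post.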
Telemetry schemas pose several challenges to standard schema design strategies:
1. Telemetry log tables are typically too large for relational joins, even when well-indexed, to perform well, particularly on "bulk" searches that visit many records.
2. Most database engines will "anchor" a search on one index, use it to fetch records out of the base table, and finish qualifying the search using values from these records, even if other indexes exist on the qualifying rows. This search strategy performs terribly with huge tables unless the engine gets lucky and picks a highly selective index, or the search happens to be on a very selective value in the index. Statistics-based query optimizers help somewhat, but can still "whiff badly", and this can result in what I call the database "death penalty" for queries: a badly-optimized "loser" query that takes days to run.
In a future blog post, I'll talk about a search strategy I've successfully used for large telemetry schemas.
PRIMARY KEYs in InnoDB are the primary structure used to organize data in a table. This means the choice of PRIMARY KEY has a direct impact on performance, and for big datasets the performance difference can be enormous.
Consider a table with a primary search attribute such as "CITY", a secondary search attribute "RANK", and a third search attribute "DATE".
A simple "traditional" approach to this table would be something like
create table myinfo (id bigint not null auto_increment primary key,
    city varchar(50), `rank` int, info_date date) engine=innodb;
create index lookup_index on myinfo (city, `rank`, info_date);
InnoDB builds the primary table data store in a B-tree structure around "id", as it's the primary key. The index "lookup_index" contains index records for every record in the table, and the primary key of the record is stored as the "lookup key" for the index.
This may look OK at first glance, and will perform decently with up to a few million records. But consider how lookups on myinfo by a query like
select * from myinfo where city = 'San Jose' and `rank` between 5 and 10 and info_date > '2011-02-15';
are answered by MySQL:
1. First, the index B-tree is walked to find the records of interest in the index structure itself.
2. Now, for every record of interest, the entire "primary" B-tree is walked to fetch the actual record values.
This means that N+1 B-trees are walked for N result records.
Now consider the following change to the above table:
create table myinfo (city varchar(50),
    `rank` int, info_date date, id bigint not null,
    primary key (city, `rank`, info_date, id)
) engine=innodb;
create index id_lookup on myinfo (id);
The primary key is now a four-column primary key, and since "id" is distinct, it satisfies the uniqueness requirements for primary keys. The above query now only has to walk a single B-tree to be completely answered. Note also that searches against CITY alone or CITY+RANK also benefit.
Let's plug in some numbers, and put 100M records into myinfo. Let's also say that an average search returns 5,000 records.
Schema 1: (Index lookup + Primary Key lookup from index):
Lg (100M) * 1 + 5000 * Lg (100M) = 132903 B-tree operations.
Schema 2: (Primary Key lookup only):
Lg(100M) * 1 = 26 B-tree operations. (Note that this single B-tree ingress operation will fetch 5K records)
So, for this query, Schema 2 is over 5,000 times faster than Schema 1. So, if Schema 2 is answered in a second, Schema 1 will take nearly an hour and a half.
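The arithmetic above can be checked directly. Taking lg(100M) ≈ 26.6 comparisons per B-tree descent (small rounding differences from the figures in the text come from truncating the logarithm):

```python
import math

n_records = 100_000_000   # rows in myinfo
matches = 5_000           # records returned by an average search
walk = math.log2(n_records)   # comparisons per B-tree descent, ~26.6

# Schema 1: one index walk, plus one primary-key walk per matching record.
schema1_ops = walk + matches * walk

# Schema 2: a single primary-key walk reaches all matching records.
schema2_ops = walk

# The speedup is exactly (matches + 1), independent of the log base.
speedup = schema1_ops / schema2_ops
print(round(speedup))   # 5001
```

The neat part is that the ratio does not depend on the table size at all: walking one fewer B-tree per result record buys a factor of (matches + 1) for any N.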
Note that we've played a bit of a trick here, and now lookups on "ID" are relatively expensive. But there are many situations where a table identifier is rarely or never looked up, but used as the primary key as "InnoDB needs a primary key".
One of the biggest problems with application development in the context of "Big Data" is that the developer gets the database-interacting code to "work" in the developer's "playpen" database, but the code collapses when it's put into production. A related, but even more serious problem - since it won't be found as quickly - is code that works early, but has very bad degradation as the production database grows.
There are a couple of approaches to this problem:
1. Have the typical "small database" for initial coding and debugging.
2. Have a large developer playpen database. It should be a large fraction of the size of the production database, or if the production database is small, it should contain contrived (but logically consistent) data that is a large fraction of the expected size of the production database.
Developers should unit-test against both databases before committing their changes.
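One way to wire up that double unit test, sketched with two in-memory SQLite databases standing in for the "small" and "large" playpens (the table, sizes, and function under test are all illustrative, not from any particular project):

```python
import sqlite3

def make_db(n_rows):
    """Build a playpen database with n_rows of contrived but consistent data."""
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE events (id INTEGER PRIMARY KEY, amount REAL)")
    con.executemany("INSERT INTO events (amount) VALUES (?)",
                    ((float(i % 100),) for i in range(n_rows)))
    return con

def total_amount(con):
    """Stand-in for the application code under test."""
    return con.execute("SELECT SUM(amount) FROM events").fetchone()[0]

# Unit-test against both playpens before committing: the small database
# catches logic bugs quickly; the large one exposes the degradation that
# only appears as row counts approach production scale.
for n_rows in (1_000, 1_000_000):   # sizes are illustrative
    con = make_db(n_rows)
    assert total_amount(con) == sum(i % 100 for i in range(n_rows))
print("ok")
```

Timing the large-playpen run in CI is also worthwhile: a query whose runtime grows faster than the row count between the two databases is exactly the "works early, degrades badly" failure described above.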
More data has been produced in the last five years than in all of human history before then. But what's driving this big data revolution? We'll discover what opportunities it opens up, and we'll uncover the pitfalls we might be facing. Plus, news that scientists have uncovered the origins of the first water on Earth, and we talk to the team who raced a solar powered car 3,000 kilometres across Australia...
An infographic from Aureus about the top trends in big data, analytics and analytical models: the trends that define analytics today and are likely to have a significant influence in 2014 and beyond, around the globe.
This week, broadcasting live from the centre of Cambridge, the Naked Scientists delve into the digital age we live in. We look at new, exciting ways to get kids into coding, how big data is changing the world of healthcare, and we take to skies to go drone racing. But what are the problems we face in this technological age? We find out who is using our online data, and explore the dangers of connecting to public Wi-Fi...
On 13-14 June, Stockholm will be the best place in Europe to discuss Multi-core, Big Data, Cloud, Embedded, NoSQL, Mobile and the Future of the Web. The Erlang User Conference 2013 features over 40 speakers including top experts such as the inventors of Erlang Mike Williams, Robert Virding and Joe Armstrong, the author of Yaws [...]
Issue 02/2013 of the eStrategy magazine is now available again as a free download at www.estrategy-magazin.de. As in past issues, the eStrategy team has once again bundled a wealth of expert knowledge on e-commerce and online marketing across more than 120 pages.
The focus of issue 02/2013 is Big Data. Among other contributions, the Fraunhofer Institute for Intelligent Analysis and Information Systems IAIS delivers a comprehensive article describing and analyzing the potential and possibilities of Big Data.
This is complemented by many other exciting topics from the online world: besides Big Data as the main topic, an expert article from eBay on the "Commerce Revolution" was secured. The eStrategy editorial team also conducted an expert interview with Zappos.com, offering a look behind the scenes of the most successful online shoe retailer. The new issue also takes a closer look at the trending topic of the "sharing economy," and specialists share their know-how on current topics such as Google Shopping, future commerce, usability, and more.
The topics of the current eStrategy issue at a glance:
Big Data and the customer journey: dreams come true...
In Germany, the potential of web shops is still underestimated. "Can't be done" doesn't exist: how manufacturers and retailers can score beyond the niche
The sharing economy: the new era of the social web
Mom-and-pop shop 2.0, or Big Data
Emotional usability: succeeding in e-commerce with heart and mind
Big Data: a head start through knowledge. Using new technologies toward the data-driven "mom-and-pop shop"
eBay: Commerce Revolution
Big Data: buzzword or trend?
Zappos.com: a look behind the scenes of one of the most successful online retailers
Google Shopping: product data as a critical success factor
Online Marketing Intelligence (OMI)
Google AdWords 2.0: enhanced campaigns and their possibilities
How to steer SEO with good data
Legal pitfalls of mobile advertising
Big Data: between copyright and data protection law
Open source software: legal foundations plus opportunities and risks for companies
As always, the current issue of the eStrategy magazine is rounded out with book recommendations and interesting surfing tips.
Who is the magazine for? The magazine is read primarily by shop and website operators, agencies, management consultancies, and IT and marketing executives, and is of course also intended for anyone else interested in e-commerce, online marketing, web development, and mobile.
Outlook: Issue 03/2013 will appear on September 10. Topic planning for the coming issue is already under way, and guest authors are welcome to contact the eStrategy editorial team with topic proposals at firstname.lastname@example.org. We also welcome feedback and inquiries about partnerships.
We went to London for the Europe’s Customer Festival and we brought back 7 takeaways around the themes of the conference: Loyalty (understand, engage and retain your customers), Big Data (gain insights by understanding behavior), Omni Channel (create a seamless experience online and offline) and Total Payments (the moment of payment is critical to a customer experience strategy).
RANDA Solutions announces participation in a panel at this year's SITE 26th Annual Conference in Las Vegas featuring an emphasis on big data as three experts in education information technology present "Research on Implementing Big Data". The panel surveys technologies, processes and change management dynamics of implementing big data initiatives.
This post is the first of a series of posts related to Big Data, since I thought it was worth going in-depth with this topic. Big Data is a big word, a big buzzword, some might even call it big bullshit, since many components revolving around Big Data, and especially the ones on the analytics/methodology ...
A few months ago I decided to join the party and pick up a Raspberry Pi. It's a $25 full-fledged ARM-based computer the size of a credit card. There's also a $35 version, of which I've ended up buying a handful so far. Due to the cost, this allows you to use a computer in dedicated applications where it otherwise wouldn't be justified or practical. Since then I've been poring over the different things people have done with their Pi. Here are some that interest me:
Setting up security cameras or other dedicated cameras like a traffic cam or bird feeder camera
Since the Pi runs an ARM-based version of Linux, I'm already familiar with practically everything on that list. The OS I've loaded is Raspbian, a Debian variant. This makes it a lot easier to get up and running.
After recently divesting myself of some large business responsibilities, I've had more personal time to dedicate to things like this. Add in the vacation I took during Christmas and New Years and I had the perfect recipe to dive head-first into a Pi project. What I chose was something that I've always wanted.
The database and Big Data lover in me wants data, lots of it. So I've gone with building a black box for my car that runs all the time the car is on, and logs as much data as I can capture. This includes:
Once you've got a daemon running and the inputs are being saved, the rest is all just inputs. It doesn't matter what it is; it's just input data.
My initial goal is to build a blackbox that constantly logs OBD2 data and stores it to a database. Looking around at what's out there for OBD2 software, I don't see anything that's built for long term logging. All the software out there is meant for two use cases: 1) live monitoring, and 2) tuning the ECU to get more power out of the car. What I want is a third use case: long term logging of all available OBD2 data to a database for analysis.
In order to store all this data I decided to build an OBD2 storage architecture that consists of:
JSON + REST web services API
SDK that existing OBD2 software would use to store the data it's capturing
Wrapping existing open source OBD2 capture code so it runs as a daemon on the Pi
Logging data to a local storage buffer, which then gets synced to the aforementioned cloud storage service when there's an internet connection.
Right now I'm just doing this for myself. But I'm also reaching out to developers of OBD2 software to gauge interest in adding this storage service to their work. In addition to the storage, an API can be added for reading back the data such as pulling DTC (error) codes, getting trends and summary data, and more.
The first SDK I wrote was in Python. It's available on GitHub. It includes API calls to register an email address to get an API key. After that, there are some simple logging functions to save a single PID (OBD2 data point such as RPM or engine temp). Since this has to run without an internet connection I've implemented a buffer. The SDK writes to a buffer in local storage and when there's any internet connection a background sync daemon pulls data off the buffer, sends it to the API and removes the item from the buffer. Since this is all JSON data and very simple key:value data I've gone with a NoSQL approach and used MongoDB for the buffer.
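The buffer-and-sync pattern the SDK uses can be sketched like this. This is a deliberately simplified stand-in: a plain deque replaces the MongoDB-backed buffer, a list replaces the remote API's store, and all function and field names are illustrative rather than the SDK's actual API:

```python
import collections
import time

buffer = collections.deque()   # stand-in for the MongoDB-backed local buffer
uploaded = []                  # stand-in for the remote API's data store

def log_pid(pid, value):
    """SDK call: record one OBD2 data point locally, never touching the network."""
    buffer.append({"pid": pid, "value": value, "ts": time.time()})

def have_connection():
    """Stub: pretend the car currently has an internet connection."""
    return True

def sync_once():
    """One pass of the background sync daemon: drain the buffer to the API."""
    while buffer and have_connection():
        item = buffer[0]
        uploaded.append(item)   # in the real SDK, an HTTP POST to the JSON API
        buffer.popleft()        # remove only after a successful upload

log_pid("RPM", 2400)
log_pid("ENGINE_TEMP", 88)
sync_once()
print(len(buffer), len(uploaded))
```

The ordering matters: the item is removed from the buffer only after the upload succeeds, so a dropped connection mid-sync leaves the data point in local storage to be retried on the next pass.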
The API is built in PHP and runs on a standard Linux VPS under Apache. At this point the entire stack has been built. The code's nowhere near production-ready and is missing some features, but it works enough to demo. I've built a test utility that simulates a client car logging 10 times/second. Each time it's logging 10 different PIDs. At this rate, the client generates 100 data points per second. For a car being driven an average of 101 minutes per day, that's 606,000 data points per day.
The volume of data will add up fast. For starters, the main database table I'm using stores all the PIDs as strings and stores each one as a separate record. In the future, I'll evaluate pivoting this data so that each PID has its own field (and appropriate data type) in a table. We'll see which method proves more efficient and easier to query. The OBD2 spec lists all the possible PIDs. Car manufacturers aren't required to use them all, and they can add in their own proprietary ones too. Hence my ambivalence, for now, about creating a logging table that contains a field for each PID. If most of the fields are empty, that's a lot of wasted storage.
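The two layouts under consideration can be compared on a toy record. The PID names here are invented for illustration; the real PID list comes from the OBD2 spec:

```python
# Tall layout: one row per PID, values stored as strings (the current design).
tall = [
    {"ts": "2013-01-05T10:00:00", "pid": "RPM",         "value": "2400"},
    {"ts": "2013-01-05T10:00:00", "pid": "ENGINE_TEMP", "value": "88"},
]

def pivot(rows):
    """Wide layout: one row per timestamp, one typed field per PID.
    PIDs a manufacturer doesn't emit would simply be absent (or NULL in a
    real table, which is the wasted-storage concern)."""
    wide = {}
    for r in rows:
        wide.setdefault(r["ts"], {})[r["pid"]] = float(r["value"])
    return wide

print(pivot(tall))
```

The tall layout never wastes space on unused PIDs and absorbs proprietary ones for free; the wide layout gives each PID a proper data type and makes per-timestamp queries a single row read. That trade-off is exactly what the evaluation above will measure.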
Systems integration is much more of a factor in this project than coding each underlying piece. Each underlying piece, from what I've found, has already been coded somewhere by some enthusiast. The open source Python code already exists for reading OBD2 data. That solves a major coding headache and makes it easier to plug my SDK into it.
There are some useful smartphone apps that can connect to a Bluetooth OBD2 reader to pull the data. Even if they were to use my SDK, it's still not an ideal solution for logging. In order to log this data, you need a dedicated device that's always on when the car's on and always logging. Using a smartphone can get you most of the way there, but there'll be gaps. That's why I'm focusing on using my Pi as a blackbox for this purpose.
19 Reasons why Democrats will remain divided - and what it means for the party's future.
Throughout most of the 2016 presidential primaries, the media focused on the noisy and reactionary rift among Republicans. Until the battle between Hillary Clinton and Bernie Sanders turned acrimonious in the home stretch, far less attention was paid to the equally momentous divisions within the Democratic Party. The Clinton-Sanders race wasn't just about two candidates; instead, it underscored a series of deep and growing fissures among Democrats, along a wide range of complex fault lines, from age and race to gender and ideology. And these disagreements won't fade with a gracious bow-out from Sanders, or a victory in November over Donald Trump. For all the talk of the Democrats' need for "unity," it would be a serious mistake to paper over the differences that came to the fore in this year's primaries. More than ten million Democrats turned out in force this year to reject the party establishment's cautious centrism and cozy relationship with Wall Street. Unless Democrats heed that message, they will miss a historic opportunity to forge a broad-based and lasting liberal majority.
To help make sense of what's causing the split, and where it's headed, we turned to 23 leading historians, political scientists, pollsters, artists, and activists. Taken together, their insights reinforce the need for a truly inclusive and vigorous debate over the party's future. "There can be no settlement of a great cause without discussion," observed William Jennings Bryan, the original Democratic populist insurgent. "And people will not discuss a cause until their attention is drawn to it."
It goes way, way back
BY RICK PERLSTEIN
The schism between Hillary Clinton and Bernie Sanders is knit into the DNA of the modern Democratic Party, in two interrelated ways. The first is ideological: the battle of left versus right.
Start in 1924, when the party cleaved nearly in two. That year, at Madison Square Garden, the Democratic convention took a record 103 ballots and 16 days to resolve a fight between the party's urban wing and its conservative opponents. How conservative? Well, the convention was nicknamed the "Klanbake," because one of the great issues at stake was, no kidding, whether the KKK was a good or a bad thing. The divide was so heated that tens of thousands of hooded Klansmen held a rally and burned crosses to try to bully the party into meeting their demands.
Eight years later, under Franklin Roosevelt, the party's urban, modernist wing established what would become a long hegemony over its reactionary, Southern one. But that hegemony remained sharply contested from the very beginning. In 1937, bipartisan opponents of FDR banded together to forge the "Conservative Manifesto." Co-authored by a Southern Democrat, the manifesto called for lowering taxes on the wealthy, slashing government spending, and championing private enterprise. Hillary Clinton's eagerness to please Wall Street can be traced, in part, to that ideological split during the New Deal.
Indeed, over the years, many of the most "liberal" Democrats have remained sharply conservative on economic questions. Eugene McCarthy, the "peacenik" candidate of 1968, ended up backing Ronald Reagan. Dan Rostenkowski, the lunch-pail chairman of the House Ways and Means Committee, proposed a tax package in 1981 that was more corporate-friendly than Reagan's. Jerry Brown of California, long derided as "Governor Moonbeam," campaigned for president in 1992 on a regressive flat tax. That same year, Bill and Hillary Clinton won the White House with the business-funded support of the Democratic Leadership Council, which sought to downplay the "big government" solutions championed by FDR.
Which brings us to the second strand in the party's divided DNA: It's sociological.
Slate's Jamelle Bouie has pointed out the pattern's clocklike consistency: Since the beginning of the modern primary process in 1972, the Democratic divide has settled into a battle between an "insurgent" and the "establishment." But Bouie errs, I think, in labeling every insurgent as "liberal." Just look at Brown in 1992, an insurgent who was conservative on economic issues. Or Hubert Humphrey in 1968 and 1972, an establishment favorite whose signature legislative initiatives, including centralized planning boards to dictate industrial production, were more socialist than those of Sanders.
This year, however, the traditional order of battle aligns with crystalline precision. Clinton, endorsed by 205 out of 232 Democratic members of Congress, is clearly the establishment's pick, and also, increasingly, that of Wall Street masters of the universe terrified by the prospect of Donald Trump. Sanders represents the guerrilla faction, arrayed this time behind the economically populist banner of FDR.
Does history tell us anything about how Democrats can bridge their long-running divide and forge a stronger, more unified party? Sanders would do well to remember that sore loserdom never helps. ("George McGovern is going to lose," a leading Democrat supposedly vowed after Humphrey lost the nomination in 1972, "because we're going to make him lose.") And Clinton needs to recognize that campaigning on economic liberalism is almost always a good political bet. (Even at the height of Reagan's morning-in-America blather in 1984, barely a third of American voters favored his plans to reduce the deficit by slashing social programs.)
If Hillary has any doubts about embracing the economic agenda laid out by Sanders, she should ask the insurgent of 1992: William Jefferson Clinton. The man who ended a dozen years of presidential exile for the Democrats didn't do it simply by promising to get tough on crime and to "end welfare as we know it." He also pledged $80 billion in federal investments to improve America's cities and to create four million new jobs, not to mention, of course, a plan to deliver health care to all Americans.
It's Obama's fault for raising our hopes
JACOB HACKER, PROFESSOR OF POLITICAL SCIENCE AT YALE AND CO-AUTHOR OF WINNER-TAKE-ALL POLITICS: We've now had almost eight years of a Democratic presidency. And with the exception of the policy breakthroughs in 2009 and 2010, they've been viewed as relatively lean years by many in the Democratic Party. There's a sense of, "We went with someone within the system, and look what happened: Republicans still tried to crush that person. So let's go for the whole thing." There's a sense that supporting the Democratic establishment and going the conventional route hasn't been that productive.
MYCHAL DENZEL SMITH, AUTHOR OF INVISIBLE MAN, GOT THE WHOLE WORLD WATCHING: A lot of young people who showed up to vote for Obama were voting for the very first time. But now they're looking at the ways economic inequality persists, and they're saying, "Oh, the Democratic Party doesn't actually stand against that." They're looking at the deaths of Trayvon Martin and Michael Brown, the two big linchpins in the Black Lives Matter movement, and they're like, "Oh, Democrats are actually the architects of the policies that have affected and continue to define young black life in terms of systemic, institutionalized racism." So you have young folks getting into the Democratic Party and realizing they don't have a place.
ASTRA TAYLOR, AUTHOR OF THE PEOPLE'S PLATFORM: TAKING BACK POWER AND CULTURE IN THE DIGITAL AGE: This is in part a symptom of the expectations that people had for the Obama administration that weren't met. It got its first major expression through Occupy Wall Street, and it's still playing out. Because nothing has changed, and people know that.
RUY TEIXEIRA, CO-AUTHOR OF THE EMERGING DEMOCRATIC MAJORITY: You can make the case that Obama has been a very successful and progressive president, but people are impatient. What used to keep people in line, so to speak, when they had these kinds of dissatisfactions was, "Oh, I'm really frustrated, but what can we do? The country is so right-wing. We've got to worry about the national debt; there's no room in the system for change." Now there's much more of a sense of possibility. The Democratic Party has contributed to this transformation by becoming more liberal, and by ceasing to be obsessed with the national debt and the deficit.
ELAINE KAMARCK, SENIOR FELLOW AT THE BROOKINGS INSTITUTION AND AUTHOR OF PRIMARY POLITICS: Here's the irony: the Bernie people are the Obama people. They're all the young people; that's the Obama coalition. They're frustrated because under Obama, nothing much happened that they liked. They're taking it out on Hillary, which is unfortunate, since she's much more capable of making something happen.
JEDEDIAH PURDY, PROFESSOR OF LAW AT DUKE AND AUTHOR OF AFTER NATURE: The disappointment in Obama took a while to set in. The Obama campaign had the form and rhetoric of transformative politics, but not the substance. Many of us believed or hoped the substance might follow the form; but it didn't. It turns out you need a program that challenges existing power and aims to reshape it. So Sanders represents the continuation of these insurgent energies. Clinton is also the continuation of Obama, but the Obama of governance, not of the campaign.
It's Hillary's fault for lowering our hopes
JOHN JUDIS, FORMER SENIOR EDITOR AT THE NEW REPUBLIC AND CO-AUTHOR OF THE EMERGING DEMOCRATIC MAJORITY: In 1984, you had Walter Mondale, a candidate of the Democratic establishment, pitted against a young upstart, Gary Hart. The split wasn't left-right; it was young-old, energetic-tired, vision-pragmatism. Bernie, for all his 74 years, represents something still of the rebellious Sixties that appeals to young voters, while Hillary represents a tired incrementalism, utterly uninspiring and rooted largely in identity politics and special interest groups, rather than in any vision for the future.
The party hasn't kept up with its base
JILL FILIPOVIC, LAWYER AND POLITICAL COLUMNIST: The party itself has been stuck in some old ideas for a while. You've been seeing movement around the edges, whether from Elizabeth Warren or these grassroots movements for income inequality. The pro-choice movement, for example, is a key part of the Democratic base that has liberalized and modernized and completely changed its messaging in a way that the party is now just catching up to. So you get these internal discords that dredge up a lot of bad feelings.
DANIELLE ALLEN, DIRECTOR OF THE EDMOND J. SAFRA CENTER FOR ETHICS AT HARVARD: In the last 20 years, we've collectively experienced various forms of social acceleration. Rates of change in social dynamics have increased across the spectrum, from income inequality to mass incarceration to immigration to the effects of globalization and the restructuring of the economy. When you have an acceleration of social transformation, there's a lag problem. The reigning policy paradigms will be out of sync with the actual needs on the ground. That's what we're experiencing now.
JEDEDIAH PURDY: The people who have been drawn to the Sanders campaign have no love for or confidence in elites, Hillary's habitus. And why should they? They've seen growing inequality and insecurity, the naked corruption of politics by oligarchic money, total cynicism in the political class of consultants and pundits, and wars so stupid and destructive that Trump can say as much and win the GOP primaries. There's a whole world that people are surging to reject.
Bernieâs supporters arenât living in reality
DAVID SIMON, CREATOR OF THE WIRE: I got no regard for purism. What makes Bernie so admirable is he genuinely believes everything that comes out of his mouth. It's incredibly refreshing. If he didn't have to govern with people who don't believe what he's saying, what a fine world it would be.
I look at the hyperbole from Bernie supporters that lands on my doorstep. Either it's stuff they believe, in which case they're drinking the Kool-Aid, so they're not even speaking in the vernacular of reality. Or what they're doing is venal and destructive. That level of hyperbole, which Bernie himself is not responsible for, is disappointing. The truth is, it's not just your friends who have utility in politics; sometimes it's the people who are against you on every other issue. If you can't play that game, then what did you go into politics for?
THEDA SKOCPOL, PROFESSOR OF GOVERNMENT AND SOCIOLOGY AT HARVARD: A lot of Bernie supporters are upper-middle-class people. I'm surrounded by them in Cambridge. I'm not saying they're hypocritical. I'm just saying they're overplaying their hand by celebrating his focus on reining in the super-rich as the only way that we can talk about improving economic equality.
ELAINE KAMARCK: This is part of a bigger problem with American presidential politics selling snake oil to the voters. Everybody from Trump with his stupid fething wall, to Sanders with, "Oh, free college for everybody." Of all the dumb things: let's go ahead and give all the rich kids in America a nice break. That's not progressive, I'm sorry. But people want to believe in Peter Pan. And he's just not there.
MARK GREEN, FORMER PUBLIC ADVOCATE OF NEW YORK AND AUTHOR OF BRIGHT, INFINITE FUTURE: A GENERATIONAL MEMOIR ON THE PROGRESSIVE RISE: There's a lot of adrenaline in primaries between purity and plausibility. Sanders is the most popular insurgent in American history to get this close to a nomination, and to help define the Democratic agenda. I admire his guts to run in the first place, and I get why his combination of Bulworth and Eugene Debs makes him such an appealing candidate. But the programmatic differences between a walking wish list like Sanders and a pragmatic progressive like Clinton are dwarfed by the differences between either of them and the first proto-fascist president.
There's a double standard against Hillary
JILL FILIPOVIC: The dovetailing of gender and wealth in this election is really striking. I don't remember a lot of Democrats ripping John Kerry to shreds for being wealthy when he ran for president. But it's been interesting to see Clinton demonized for her Goldman Sachs speeches. For some Democrats, that seems to be inherently disqualifying. Obviously, money would be an issue even if she were a male candidate, because this is an election that's about income inequality. But the sense that she's somehow undeserving, that does strike me as gendered.
THEDA SKOCPOL: Older women support Clinton because they've witnessed her career, and she's always been into economic redistribution. Some Sanders followers have been quite sexist in things they've said; that's very apparent to older women. A friend who studies abortion politics tells me that the nasty tweets she's gotten from Bernie supporters for backing Hillary are worse than anything she gets from the right wing.
AMANDA MARCOTTE, POLITICS WRITER FOR SALON: What you're seeing is a huge drift in the party, away from having our leadership be just a bunch of white men who claim to speak for everybody else. We're moving to a party that puts women's interests at the center, that considers the votes of people of color just as valuable as the votes of white people. Unfortunately, some of the support for Sanders comes from people who are uncomfortable with that change and are looking to a benevolent, white patriarch to save them.
ELAINE KAMARCK: Clinton is being penalized because she has a realistic view of what can be done, and that leads people to mistake her for some kind of bad conservative. She's not. She's extraordinarily liberal, particularly on children and families. But because she's been around a while, when Sanders comes out with this new radical stuff, they think, "Oh, he's the one whose heart is in the right place." But listen, she took on Wall Street before he did, in a way that hit their bottom line. If people really want to get something done, they'd vote for her.
MARK GREEN: Look, there's a debate I have with my friend Ralph Nader. He sees Hillary as more Wall Street, and I see her as more Wellesley. She's as smart as anyone, grounded, practical, engaging, and unlike most testosterone-fueled male politicians, actually listens more than lectures. So she's not as dynamic a candidate as Bill and Barack? Who is? That's an unfair comparison. But if I had to bet, I'd guess she'll be as consequential and good a president as either of them.
Poverty is fueling the divide
BY KEEANGA-YAMAHTTA TAYLOR
The Democratic Party today engages in delusional happy talk about economic recovery, while a staggering 47 million Americans are struggling in poverty. As the rich remain as wealthy as ever, working-class people continue to see their wages stagnate. In the 1970s, 61 percent of Americans fell into that vague but stable category of "middle class." Today that number has fallen to 50 percent. African Americans, the core of the Democratic Party base, continue to be plagued by dead-end jobs and diminished prospects. Fifty-four percent of black workers make less than $15 an hour. Thirty-eight percent of black children live in poverty. More than a quarter of black households battle with hunger.
This is the heart of the crisis within the Democratic Party. Eight years ago, the party ran on hope: "Yes, we can" and "Change we can believe in." Pundits openly wondered whether the United States was on the cusp of becoming a "postracial" nation; on the eve of Obama's first inauguration, 69 percent of black Americans believed that Martin Luther King's "dream" had been fulfilled. Today, the tune is quite different: Millions of Americans are more disillusioned and cynical than ever about the ability of the state to provide a decent life for them and their families.
Bernie Sanders tapped into the palpable disgust at America's new Gilded Age, and it's a revulsion that will not be quieted with a few platitudes from Hillary Clinton to "give the middle class a raise." Yet the Democratic leadership continues to treat Sanders as an unfortunate nuisance. The party keeps charging ahead the way it always has, as Clinton pivots to her right to appeal to disgruntled Republican voters. As long as the party has no challengers to its left, the thinking goes, its base has nowhere else to go.
This strategy may lead Clinton to victory in November. But there is a danger here: In winning the battle, she very well may lose the war being waged within the Democratic ranks. The inattention to growing inequality, racial injustice, and deteriorating quality of life will likely result in ordinary people voting with their feet and simply opting out of the coming election, and future ones as well. Millions of Americans already do not vote, because most elected officials are out of touch with their daily struggles, and because there is little correlation between voting and an improvement in their lives. By continuing to ignore the issues Sanders has raised, Clinton and the rest of the party establishment risk losing a huge swath of the Democratic electorate for years to come.
There is a way out. More and more voters are identifying as independents. This demonstrates that people want real choices, as opposed to politics driven by sound bites, political action committees, and billionaire candidates. The wide support for both Sanders and Trump points to the incredible vacuum that exists in organized politics. If the movements against police racism and violence were to combine with the growing activism among the disaffected, from low-wage workers to housing advocates, we could build a political party that actually represents the interests of the poor and working class, and leave the Democrats and the Republicans to the plutocrats who already own both parties' hearts and minds.
It's the economy, stupid
JOHN JUDIS: There have been insurgencies before (George Wallace in '64 and '72) that were radical. What made Wallace radical was the split in the party over civil rights. What makes Sanders radical is the lingering rage over the Great Recession.
If you want to move the question up a level theoretically, you can talk about the failure of "new Democrat" politics to deliver prosperity or economic security. Clinton and the Democrats in Washington don't understand the level of anxiety that Americans, and particularly the young, feel about their economic prospects. It can't be addressed by charts showing the drop in the unemployment rate.
BRETT FLEHINGER, HISTORIAN AT HARVARD AND AUTHOR OF THE 1912 ELECTION AND THE POWER OF PROGRESSIVISM: The Democratic Party has done a poor job of delivering on the economic promises of equality. That's what's opened up the possibility for Sanders. It's what he's believed in for 20-plus years. But the question is: What's making it resonate now? It's the failure of the party to liberalize since Bill Clinton.
JACOB HACKER: There's a feeling of, "Really? This is it? This is the recovery we've been promised?" It's been a long, difficult path since 2008 and the financial crisis. Even Democratic voters who are doing pretty well are feeling that something has gone seriously awry.
This may be the first time in my life that there's been a full-throated critique of the Democratic Party as being excessively beholden to money and too willing to work within the system. You saw echoes of this in the Howard Dean campaign, and you saw it much more forcefully in 2000 with Ralph Nader. But Nader was not running within the Democratic Party; he was clearly playing a spoiler role. Whereas Sanders is essentially trying to take the Democratic Party in a different direction.
JEDEDIAH PURDY: Bernie's campaign is the first to put class politics at its center. Not poverty, which liberal elites have always been comfortable addressing, and not "We are the 99 percent," which is populist in a more fantastical sense, but class more concretely: the jobs and communities of blue-collar people, the decline of the middle class, the cost of education.
MARK HUGO LOPEZ, DIRECTOR OF HISPANIC RESEARCH AT THE PEW RESEARCH CENTER: When you ask Clinton supporters, or people who see Clinton favorably, you'll find that more than half will say that, compared to 50 years ago, life is better in America today. Whereas among Sanders supporters, one-third will say that things are actually worse.
Democrats are too fixated on white workers
JILL FILIPOVIC: The class-based concerns that a lot of the loudest voices in the Sanders contingent of the Democratic Party focus on are the concerns of the white working class, and they aren't bringing a lot of race analysis into it. The income-inequality argument makes a case, particularly to the white working class, in a way that seems to have alienated African Americans and, to a lesser extent, the Hispanic vote.
MYCHAL DENZEL SMITH: Look at every demographic breakdown of who votes. The strongest Democratic Party voters are black women. So why is it that you're so zeroed in and focused on regaining the white working-class vote? What value does that have to you, as opposed to appeasing the voters that are actually there for you? Democrats want it both ways. They want to attract the white working-class voter again, but what they don't accept is that the reason they lost that voter is because of Republican appeals to racism. So the Democrats want to be the party of anti-racism but also win back the racists. You can't do that! Why would you want a coalition of those people? It doesn't make sense.
Democrats have neglected white workers
DAVID SIMON: There's certainly something unique about this moment, and the populist rebellion that has affected both the Republican and Democratic parties. And I think it's earned. Both parties can be rightly accused, not to the same degree, of having ignored and abandoned the working class and the middle-middle class for the past 30 years.
Millennials of color are tired of waiting
ALAN ABRAMOWITZ, PROFESSOR OF POLITICAL SCIENCE AT EMORY AND AUTHOR OF THE POLARIZED PUBLIC? WHY AMERICAN GOVERNMENT IS SO DYSFUNCTIONAL: Why are African Americans so loyal to the Clintons? Part of it is just familiarity. They feel a comfort level with the Clintons, and they really like Bill Clinton, especially older African American voters. But there's a generational divide even among African American voters. Younger African Americans and Latinos are not as supportive of Clinton.
MARK HUGO LOPEZ: I was in Chicago recently, and I was surprised when a young Latina college student stood up and described how much she did not like Clinton. She actually said, "I hate Hillary Clinton." That's the phrase she used, which drew a round of applause from everybody in the room.
JOHNETTA ELZIE, A LEADER OF BLACK LIVES MATTER: I don't think anyone was ready to deal with black millennials. I just don't believe that anyone in politics who is running on a national scale knows how to address young black or brown people in a way that's different from how they addressed our elders. Because we're not the same.
I remember when Hillary got shut down by some young black students in Atlanta. They wanted to know, "What does she even know about young black people in this neighborhood and what we go through?" John Lewis basically told them, "You need to wait to speak to Hillary. Just be polite, ask questions, yada yada." And people were like, "But you were a protester before you were a politician! You know what it is, you know the sense of urgency, you know what it means to be told to wait and to know that we don't have time to wait."
MYCHAL DENZEL SMITH: Throughout our history, progressive movements have often left out the idea of ending racism. Then they go to communities of color and say, "What choice do you have but to join with us, to put aside your concerns about the differences that we experience in terms of racism?" In this election, the movement on the ground has at least pushed Democrats to adopt the language of anti-racism. They've had to say things like "institutionalized racism"; they're learning the language on the fly. The problem is, they understand that they don't actually have to move on these issues, because they have Trump to run against. All they have to do is say, "Look at how crazy the other option is. Where else are you going to go?"
Authenticity is gender-biased
BY RIVKA GALCHEN
In an early scene in Stendhal's The Red and the Black, a carpenter's son hired as a tutor for a wealthy family dons a tailored black suit provided by his new employer. The black suit was a new and radical thing in this era, one in which bakers dressed like bakers, nobility like nobility. In a black suit, one's social class was cloaked, a form of what back then was often termed hypocrisy.
Lately, as I've followed the contest between Hillary Clinton and Bernie Sanders, I've found myself thinking of The Red and the Black, and its play with antiquated notions of authenticity. The passionate support for Sanders has, one hopes, much to do with excitement about his insistent expression of a platform of economic populism. But it would be naïve to think it doesn't also have to do with his appearance, his way of speaking. There is authenticity, and there is appearing authentic. These two things may mostly align, as they largely but not entirely do with Sanders. (Most anti-establishment figures avoid 35 years in government.) Or they may almost perfectly not align, as in the case of Donald Trump. (A liar celebrated for speaking the truth.) Either way, it's worth investigating authenticity in our political thinking, both to understand its power and to consider how it helps or hurts the kind of effective, forward-looking agenda that we hope will emerge from a fractured Democratic Party.
One problem with authenticity as a campaign tactic is its unsettling, subconscious alliance with those who benefit from the status quo. If you're not who you say you are, if you're moving on the social ladder, or are not in "your place," you're inauthentic. Keeping it real subtly advocates for keeping it just like it is.
The semiotics of Sanders's political authenticity (dishevelment, raised voice, being unyielding) are available to male politicians in a way they are not to women (and to whites in a way they are not to blacks or Hispanics or Asians). Black women in politics don't have the option to wear their hair "natural"; nearly all white women appear to have blowouts, even Elizabeth Warren. It's nonsense, and yet the only politically viable option, and therefore not nonsense.
It's not just that research has shown that women are perceived to talk too much even when they talk less, or that men who display anger are influential while women who do so are not. It's that there is no such thing as "masculine wiles." The phrase just doesn't exist. This doesn't mean that calling into question Clinton's authenticity and trustworthiness, the fault line along which the Democratic Party has riven, is pure misogyny. It just means that it's not purely not misogyny.
Clinton is often described as the institutional candidate, the establishment. There's a lot of truth to that. But she's also the woman who initially kept her name (and her job) as the wife of the governor of Arkansas, who used the role of First Lady as cover to push for socialized health care, and who was instrumental in getting health insurance for eight million children past the Republican gorgons when a full reform failed. Someone who has survived being attacked for nearly 40 years must possess a highly developed sense of what the critic Walter Benjamin calls "cunning and high spirits": the means by which figures in fairy tales evade the oppressive forces of myth, and mortals evade gods. Somehow she achieved one of the more liberal voting records in the Senate, despite rarely being described as a liberal by either the left or the right.
Perhaps one reason that Clinton's "firewall" of black support has remained standing is that "authenticity" has less rhetorical force with a historically oppressed people, for whom that strategy, being recognizably who people in power think you ought to be, was never viable. There are, of course, important and substantial criticisms of Clinton. But perhaps when we say that Hillary is inauthentic, we're simply saying that she is a woman working in the public eye.
Democrats on both sides of the party should consider which tactic best suits the underdogs they feel they are defending, and want to defend. Whoever receives the nomination, perhaps the worry should shift from whether the candidate is cunning to whether the candidate, and the Democratic Party, can be cunning enough.
The disruption is digital
BY ZEYNEP TUFEKCI
Insurgents like Bernie Sanders have been the rule, not the exception, in the modern era of Democratic politics. From Eugene McCarthy to Jesse Jackson, the party's left wing regularly broke ranks to run on quasi-social democratic platforms. But with the exception of George McGovern in 1972, these challengers all fell short of the nomination, partly because they lacked the money to effectively organize and advertise. The party establishment had a virtual monopoly on every political tool needed to win.
Slowly at first, and then with a big, loud bang, digital technologies changed all that. First came Howard Dean, who used the internet to "disrupt" the Democratic Party in 2004. Powered by small online donations and digitally organized neighborhood "meetups," Dean outraised his big-money rivals and revolutionized the way political campaigns are funded. Four years later, Barack Obama added a digitally fueled ground game to Dean's fund-raising innovations, creating a campaign machine that could identify and turn out voters with a new level of accuracy. But when Obama's policies fell short of the left's expectations, many turned their energies to building a different kind of digital rebellion, this time outside of electoral politics.
Sparked by a single email in June 2011, Occupy Wall Street exploded in a matter of months into a worldwide movement that mobilized massive street protests, including many who'd sworn off partisan politics as hopelessly corrupted. Occupy demonstrated how the masses could organize without a campaign or a candidate to rally around, opening a space that would soon be joined by Black Lives Matter and other activist groups. It also unleashed a populist fervor on the left. As the 2016 campaign approached, Occupy veterans joined forces with left-leaning activists inside the party. Instead of rejecting traditional politics, they decided to disrupt the Democratic primaries, the way Tea Party activists did to the GOP in 2010 and 2012.
In some ways, it didn't matter that Sanders was the candidate they rallied behind. His ideological consistency earned him the trust of the left, and they in turn stoked his online fund-raising, producing the flood of $27 average donations that kept him competitive with Hillary Clinton. In the spirit of Occupy, Sanders's digital operation was more volunteer-driven and dispersed than Obama's; instead of "Big Data," the watchword for Sanders was "Big Organizing," as hundreds of thousands of volunteers effectively ran major parts of the show. A pro-Sanders Reddit group attracted almost a quarter-million subscribers, who helped organize everything from voter-registration drives to phone banks. A legion of young, pro-Sanders coders on Slack produced apps to mobilize volunteers and direct voters to the polls. There was even a BernieBNB app, where people could offer their spare couches to #FeelTheBern organizers.
Ultimately, the Sanders campaign became a lesson in both the potential and the limitations of a digitally fueled uprising. It seems miraculous that a 74-year-old democratic socialist could come so close to beating a candidate with Clinton's institutional advantages. But Sanders's superior digital reach couldn't help him win over African Americans and older women, most of whom favor Clinton. And all his fans on social media could not alter the mainstream media's narrative that this was yet another noble but doomed insurgency.
Whether or not Clinton wins in November, it's safe to expect another Democratic insurgency in 2020 and beyond. Digital fund-raising, organizing, and messaging have given the left the weapons not just to tilt at the establishment's windmills, but to come close to toppling them. Next time, they might just succeed.
Split? What split?
RUY TEIXEIRA: I don't see differences massive enough to provoke any kind of split that has serious consequences. It's just part of an ongoing shift in the Democratic Party. The party is going to continue to consolidate behind a more aggressive and liberal program, and the Sanders people are a reflection of that. We shouldn't lose track of the fact that Clinton will be the most liberal presidential candidate the Democrats have run since George McGovern.
BRETT FLEHINGER: In historic terms I don't think this party is split. I don't even think the divide is as big as it was in 2000, when a significant portion of Democratic voters either considered Ralph Nader or voted for Nader.
ALAN ABRAMOWITZ: It's easy to overstate how substantial the divide is. Some of it is more a matter of style, the sense that Clinton and some of these longtime party leaders are tainted by their ties to Wall Street and big money. But it's not based so much on their issue positions, because Clinton's issue positions are pretty liberal. Not as far left as Bernie, but then, nobody's as far left as Bernie. Part of it is a distortion, because you can't get to Bernie's left, except maybe on the guns issue. So Bernie can always be the one taking the purist position.
THEDA SKOCPOL: This isn't a revolution. The phenomenon of having a left challenger to somebody called an establishment Democrat goes way back. It's been happening my whole life, and I'm not a child. It's never successful, except in the case of Obama. And Obama had something that the other challengers didn't: He was able to appeal to blacks. Most of these left candidates appeal to white liberals, and Sanders is certainly in that category. His entire base is white liberals.
KEVIN BAKER, AUTHOR OF THE NOVEL STRIVERS ROW: Democrats have almost always been the party that co-opts and brings in literal outsiders and outside movements. In the late nineteenth century, it was a bizarre coalition between Southern bourbon planters and big-city machines, which each had their own grievances. Then it was an uneasy coalition between those same machines and the agrarian populists brought in by William Jennings Bryan. Then you had the Grand Coalition, the biggest, most diverse coalition in American history, which was the New Deal one: farmers and workers, urbanites and Main Street progressives, blacks, whites, feminists, unionists. It lasted a long time, until it broke down over race and the Vietnam War in the 1960s. Finally, you had the rise of the Democratic Leadership Council and the Clinton-ite and Obama-ite version of more conservative progressivism. But what that coalition left unanswered, for a lot of people in the party and in the country, was just how they were going to make a living in this new world. What we're seeing now is a very civil contest, relatively speaking, over who is going to lead that coalition.
Don't worry: Trump will unite us
JOHN JUDIS: Whatever shortcomings Clinton's campaign has in creating unity are likely to be overcome by the specter of a Trump America.
RUY TEIXEIRA: I don't see the people who support Sanders, particularly the young people, as being radically different from the Clinton folks in terms of what they support. They'll wind up voting for Hillary when she runs against Trump.
DAVID SIMON: If you're asking me if I think the Democratic Party will heal in the general election, I think it will. Trump helps that a lot. The risks of folding your arms and walking away are fundamental, in a way they might not be with a more viable and coherent candidate. But let's face it, the idea of this man at the helm of the republic is some scary gak.
Bernie isn't the future, but his politics are
ALAN ABRAMOWITZ: Younger voters are the future of the Democratic Party. But Bernie Sanders is not the future of the Democratic Party. The question is: Who's going to come along who can tap into that combination of idealism and discontent that he represents?
JOHN JUDIS: Sanders is an old guy, like I am, and not one, I suspect, to build a movement. And I think "movement" is probably the wrong word. What inspires movements is particular causes (Vietnam, civil rights, high taxes) or a party in power that is seen as taking the wrong stance on those issues (George W. Bush for liberals, Barack Obama for Republicans). If Clinton is the next president, I don't expect a movement to spring up. Instead, I'd expect to see caucuses within the party that take a Bernie Sanders/Elizabeth Warren point of view. But if Trump wins, you will see a movement, whatever Sanders does.
JACOB HACKER: There's a growing chunk of the Democratic electorate that believes the existing policy ideas that define the mainstream of the party don't go far enough. The question becomes: What do those folks do after the election? What kind of force will they be within the party going forward? Can they form a strong movement that will press national politicians to move to the left, the way the Tea Party did on the right?
If a Democrat wins in November, you probably can't get a movement like the Tea Party under Obama, or MoveOn under Bush. But what you could get, what you would hope to get, is a true grassroots, longer-term movement that tries to move the center of gravity of American politics to the left.
JEDEDIAH PURDY: But what would a movement built out of Sanders supporters be for, exactly? The campaign itself gives some answers. The Sanders campaign is much more distinct from the Clinton campaign, in substance, than Obama's first campaign was. The Fight for $15, single-payer health care, stronger antitrust law, free college: These are huge, concrete goals. If people can organize around one guy who expresses them but, if elected, could do very little unless we also changed Congress, then we should be able to organize around them to try to change the makeup of political structures from top to bottom. Maybe we need to move into our local Democratic parties. The Moral Majority took over school boards with a specific agenda they could implement. Are there electoral institutions, as well as party institutions, that we should be aiming to reshape in our image?
DANIELLE ALLEN: It's a huge opportunity for Democrats, if they can take all the incoming young participants seriously and give them a real role in digging into hard policy questions. This is a chance to cultivate leaders who can run for office across the landscape: not just national office, but local office. The Republicans have done a much better job, in all honesty, at growing up a generation of younger politicians. Democratic politicians skew older, so that sums up the real question about the Sanders moment: Is this enough of a wake-up call to the Democratic Party to start bringing talent in?
It's a trap!
ASTRA TAYLOR: The young thing, this millennial left turn, is great. But there's a part of me that's afraid. In the 1960s, the story was the counterculture and the new left. It was Students for a Democratic Society, the civil rights movement, the war in Vietnam. But there's been a lot of smart revisionist scholarship that says the story of the '60s was not the new left, it was actually the new right, which spent the decade laying the groundwork for its resurgence. At this moment, when left-wing millennials are getting a lot of attention, my fear is that there's a conservative counterpoint that I'm just not seeing, because we're all in our little social and political bubbles. We should study the split between the new left and the new right in the '60s, and make sure that history doesn't repeat itself.
The worst thing would be to ignore the split
DAVID SIMON: The Democrats are going to win, because they're up against Trump. But I'm worried they're going to paper over a fundamental flaw in their coalition, which is: You've got to help working people and the middle-middle class. They're not your guaranteed votes, and you lost them once to Reagan. Maybe you can do without them long-term. But I would get them back because (a) it secures your coalition going forward and (b) it's the right thing to fething do.
JILL FILIPOVIC: The brawls that people are having on Twitter every day: I don't know if that's healthy for the party. But the bigger debates are really important conversations to be having. Who is our coalition? Who are we representing, and how do we best do that? Do we want to be the center-left party of the '90s, or should we be serving a more diverse and liberal voter base? I don't think those conversations are going to destroy the party. I think they're going to set us in a better direction.
JACOB HACKER: It's nice to be able to talk about what's happening on the Democratic side, because all of the focus has been on the Republican side. It's a bit like living in a house that's got some peeling paint and holes in the roof. Right next to it is a derelict building that's practically falling over. And you're like, "Man, I've got a nice house." But if you just put your hand up and cover up your neighbor's house so you can't see it, you'd be like, "Um, I think my house needs some work." The Democratic Party is kind of like that right now. I want to live there, but I really would love to upgrade it.
The best is yet to come
BY NAOMI KLEIN
On the surface, the battle between Hillary Clinton and Bernie Sanders looks like a deep rift, one that threatens to splinter the Democratic Party. But viewed in the sweep of history, it is evidence of something far more positive for the party's base and beyond: not a rift but a shift: the first tremors of a profound ideological realignment from which a transformative new politics could emerge.
Many of Bernie's closest advisers, and perhaps even Bernie himself, never imagined the campaign would do so well. And yet it did. The U.S. left, and not some pale imitation of it, actually tasted electoral victory, in state after state after state. The campaign came so close to winning that many of us allowed ourselves to imagine, if only for a few, furtive moments, what the world would look like with a President Sanders.
Even writing those words seems crazy. After all, the working assumption for decades has been that genuinely redistributive policies are so unpopular in the U.S. that they could only be smuggled past the American public if they were wrapped in some sort of centrist disguise. "Fee and dividend" instead of a carbon tax. "Health care reform" instead of universal public health care.
Only now it turns out that left ideas are popular just as they are, utterly unadorned. Really popular, and in the most pro-capitalist country in the world.
It's not just that Sanders has won 20-plus contests, all while never disavowing his democratic socialism. It's also that, to keep Sanders from hijacking the nomination, Clinton has been forced to pivot sharply to the left and disavow her own history as a market-friendly centrist. Even Donald Trump threw out the economic playbook entrenched since Reagan, coming out against corporate-friendly trade deals, vowing to protect what's left of the social safety net, and railing against the influence of money in politics.
Taken together, the evidence is clear: The left just won. Forget the nomination; I mean the argument. Clinton, and the 40-year ideological campaign she represents, has lost the battle of ideas. The spell of neoliberalism has been broken, crushed under the weight of lived experience and a mountain of data.
What for decades was unsayable is now being said out loud: free college tuition, double the minimum wage, 100 percent renewable energy. And the crowds are cheering. With so much encouragement, who knows what's next? Reparations for slavery and colonialism? A guaranteed annual income? Democratic worker co-ops as the centerpiece of a green jobs program? Why not? The intellectual fencing that has constrained the left's imagination for so long is lying twisted on the ground.
This broad appetite for systemic change did not begin with Sanders. During the Obama years, a wave of radical new social movements emerged, from Occupy Wall Street and the Fight for $15 to #NoKXL and Black Lives Matter. Sanders harnessed much of this energy, but by no means all of it. His weaknesses reaching certain segments of black and Latino voters in the Democratic base are well known. And for some activists, Sanders has always felt too much like the past to get overly excited about.
Looking beyond this election cycle, this is actually good news. If Sanders could come this far, imagine what a left candidate who was unburdened by his weaknesses could do. A political coalition that started from the premise that economic inequality and climate destabilization are inextricable from systems of racial and gender hierarchy could well build a significantly larger tent than the Sanders campaign managed to erect.
And if that movement has a bold plan for humanizing and democratizing new technology networks and global systems of trade, then it will feel less like a blast from the past, and more like a path to an exciting, never-before-attempted future. Whether coming after one term of Hillary Clinton in 2020, or one term of Donald Trump, that combination, deeply diverse and insistently forward-looking, could well prove unbeatable.
Loulou Cherinet was born in 1970 and has exhibited at most of the major art biennials. With her dual roots in both Sweden and Ethiopia, she works mostly in the international field. She is also a professor at Konstfack in Stockholm, and is currently showing five video works at Moderna Museet under the title Who Learns My Lesson Complete?, a quote from the poet Walt Whitman.
I started to translate recovery conditions from compressed sensing into the graph signal setting. So far, I have managed to relate the null space property and variants of the restricted isometry property to the connectivity properties (topology) of networks. In particular, the conditions amount to the existence of certain network flows. I would be happy if you had a look and shared your opinion with me:
A main workhorse for statistical learning and signal processing with sparse models is the least absolute shrinkage and selection operator (Lasso). The Lasso has recently been adapted to massive network-structured datasets, i.e., big data over networks. In particular, the network Lasso makes it possible to recover (or learn) graph signals from a small number of noisy signal samples by using the total variation semi-norm as a regularizer. Some work has been devoted to efficient and scalable implementations of the network Lasso. However, little is known about the conditions on the underlying network structure which ensure high accuracy of the network Lasso. By leveraging concepts from compressed sensing, we address this gap and derive precise conditions on the underlying network topology and sampling set which guarantee that the network Lasso delivers an accurate estimate of the entire underlying graph signal.
We adapt the nullspace property of compressed sensing for sparse vectors to semi-supervised learning of labels for network-structured datasets. In particular, we derive a sufficient condition, which we term the network nullspace property, for convex optimization methods to accurately learn labels which form smooth graph signals. The network nullspace property involves both the network topology and the sampling strategy and can be used to guide the design of efficient sampling strategies, i.e., the selection of those data points whose labels provide the most information for the learning task.
This work proposes a novel method for semi-supervised learning from partially labeled massive network-structured datasets, i.e., big data over networks. We model the underlying hypothesis, which relates data points to labels, as a graph signal defined over some graph (network) structure intrinsic to the dataset. Following the key principle of supervised learning, i.e., similar inputs yield similar outputs, we require the graph signals induced by labels to have small total variation. Accordingly, we formulate the problem of learning the labels of data points as a non-smooth convex optimization problem which amounts to balancing the empirical loss, i.e., the discrepancy with the partially available label information, against the smoothness quantified by the total variation of the learned graph signal. We solve this optimization problem by appealing to a recently proposed preconditioned variant of the popular primal-dual method of Pock and Chambolle, which results in a sparse label propagation algorithm. This learning algorithm allows for a highly scalable implementation as message passing over the underlying data graph. By applying concepts of compressed sensing to the learning problem, we are also able to provide a transparent sufficient condition on the underlying network structure such that accurate learning of the labels is possible. We also present a message passing formulation that allows for a highly scalable implementation in big data frameworks.
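The learning problem described in these abstracts can be made concrete with a small sketch. The code below is not the authors' implementation; it is a minimal, illustrative Chambolle-Pock primal-dual iteration (the plain, unpreconditioned variant) that recovers a graph signal from a few sampled labels by balancing a squared loss on the sampled nodes against the total variation semi-norm. The chain graph, labels, and regularization weight are invented for the example.

```python
import numpy as np

def network_lasso_labels(D, y, mask, lam=0.1, n_iter=2000):
    """Recover a graph signal from a few labeled nodes by balancing a
    squared empirical loss on the sampled nodes against the total
    variation ||D x||_1, via the Chambolle-Pock primal-dual iteration.

    D    : (E, N) edge-node incidence matrix of the data graph
    y    : (N,) label values; only entries where mask is True are used
    mask : (N,) boolean array marking the sampled (labeled) nodes
    """
    E, N = D.shape
    L = np.linalg.norm(D, 2)          # spectral norm of D
    tau = sigma = 0.9 / L             # step sizes with sigma*tau*L^2 < 1
    x, xbar, u = np.zeros(N), np.zeros(N), np.zeros(E)
    for _ in range(n_iter):
        # dual step: prox of the TV term is a clip onto [-lam, lam]
        u = np.clip(u + sigma * (D @ xbar), -lam, lam)
        # primal step: prox of the squared loss, applied on sampled nodes
        z = x - tau * (D.T @ u)
        x_new = z.copy()
        x_new[mask] = (z[mask] + tau * y[mask]) / (1.0 + tau)
        xbar = 2 * x_new - x
        x = x_new
    return x

# Toy example: a chain graph of 10 nodes, labels known only at the ends.
N = 10
D = np.zeros((N - 1, N))
for e in range(N - 1):
    D[e, e], D[e, e + 1] = 1.0, -1.0
y = np.zeros(N)
y[-1] = 1.0
mask = np.zeros(N, dtype=bool)
mask[[0, -1]] = True
x_hat = network_lasso_labels(D, y, mask)
```

For a small regularization weight the recovered signal stays close to the labels at the two sampled end nodes and varies smoothly in between; scaling this idea to big data over networks means replacing the dense matrix operations with message passing along the edges, as the abstract describes.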
OSLO: Scientists searching for everything from oil and gas to copper and gold are adopting techniques used by companies such as Netflix or Amazon to sift through vast amounts of data, a study showed on Tuesday.
The method has already helped to discover 10 carbon-bearing minerals and could be widely applied to exploration, they wrote in the journal American Mineralogist. "Big data points to new minerals, new deposits," they wrote of the findings.
The technique goes beyond traditional geology by amassing data about how and where minerals have formed, for instance by the cooling of lava after volcanic eruptions. The data can then be used to help find other deposits.
"Minerals occur on Earth in clusters," said Robert Hazen, executive director of the Deep Carbon Observatory at the Carnegie Institution for Science in Washington and an author of the study.
"When you see minerals together it's very like the way that humans interact in social networks such as Facebook," he said.
Hazen said the technique was also like Amazon, which recommends books based on a client's previous orders, or media streaming company Netflix, which proposes movies based on a customer's past viewing habits.
"They are using vast amounts of data and make correlations that you could never make," he told Reuters.
Lead author Shaunna Morrison, also at the Deep Carbon Observatory and the Carnegie Institution, said luck often played a big role for geologists searching for new deposits.
"We are looking at it in a much more systematic way," she said of the project.
Among the 10 rare carbon-bearing minerals discovered by the project were abellaite and parisite-(La). The minerals, whose existence was predicted before they were found, have no known economic applications.
Gilpin Robinson of the U.S. Geological Survey (USGS), who was not involved in the study, said the USGS had started to collaborate with the big data project.
"The use of large data sets and analytical tools is very important in our studies of mineral and energy resources," he wrote in an email.
The DCO project will also try to collect data to examine the geological history of the Moon and Mars.
This is a podcast with Joseph George, Director of Big Data Servers at HP, talking about the SL4500 servers from HP and other associated solutions for customers looking to store and process massive amounts of data.
SAP has presented new tools and services from the SAP Leonardo product line. The solutions are intended to enable companies to adopt cloud, big data and Internet of Things technologies. At the Leonardo Live conference, SAP presented a range of services and products around SAP Leonardo. This solution is based on the company's own Cloud Platform and offers a standardized... Read more
The terms “volume”, “velocity”, and “variety” turn up often – very often – in discussions about “Big Data”. Gartner’s Doug Laney has laid claim to the original collective use of these terms, in a then-META Group article from February 2001 (“3D Data Management: Controlling Data Volume, Velocity, and Variety”). If Mr. Laney is indeed correct […]
NM-Albuquerque, CenturyLink (NYSE: CTL) is a global communications and IT services company focused on connecting its customers to the power of the digital world. CenturyLink offers network and data systems management, big data analytics, managed security services, hosting, cloud, and IT consulting services. The company provides broadband, voice, video, advanced data and managed network services over a robust 265,
I was much fascinated by semantic web technology before starting my post-graduation; these days cloud computing fascinates me. I wondered when I would develop a significant approach to perceiving and learning and an innovative nature, but I think I got a lot from these academic years. Now I want to grow my knowledge to the next level; I want to meet, listen to, and work with experienced people from the IT industry. For that, ICCBDT looks like an opportunity for me. Certainly every event related to our field, especially when the subject relates to an area that fascinates you, is exciting to attend.
Cloud computing is not exactly a new technology in the industry, but newbies still don't know what it actually is. Sometimes a little knowledge also causes technology adoption to fail; nobody adopts a technology unless it benefits them. A big problem with the cloud is that everyone thinks cloud computing is insecure. Yes, security is a big issue, but if we know about a threat, then threat handling can easily be provided. I also have some questions about cloud computing in my mind. I will write more about these issues after attending this event; I am waiting for that. If you also have some queries, then you should attend this international conference, which is organised by the same university at which I am studying, RGPV, Bhopal. The conference will be held from 13th November 2013 to 15th November in Bhopal in collaboration with EMC2. For more detail about this event you can visit: http://www.rgpv.ac.in/ICCBDT/index.html
In this e-guide, Sisense co-founder Adi Azaria and Machina Research analyst Emil Berthelsen examine the current state of IoT analytics as well as where it's heading next. Learn about the impact IoT is having on everything from Hadoop big data platforms to real-time BI capabilities. Published by: Vitria Technology, Inc.
Want to get the most out of your big data? Build an enterprise data hub (EDH). Big data is rapidly getting bigger. That in itself isn't a problem. The issue is what Gartner analyst Doug Laney describes as the three Vs of Big Data: volume, velocity, and variety. Volume refers to the ever-growing […]
The Internet of Everything continues to gain momentum and every new connection is creating new data. Cisco UCS Integrated Infrastructure for Big Data is helping customers convert that data into powerful intelligence, and we're working with a number of new partners to bring exciting new solutions to our customers. Today, I want to spotlight Elasticsearch, […]
Big Data is not just about gathering tons of data, the digital exhaust from the internet, social media, and customer records. The real value is in being able to analyze the data to gain a desired business outcome. Those of us who follow the Big Data market closely never lack for something new to […]
Big Data remains one of the hottest topics in the industry due to the actual dollar value that businesses are deriving from making sense of tons of structured and unstructured data. Virtually every field is leveraging a data-driven strategy as people, process, data and things are increasingly being connected (Internet of Everything). New tools and […]
The Cloudera Sessions Roadshow helps companies navigate the Big Data journey. As Hadoop takes the data management market by storm, organizations are evolving the role it plays in the modern data center. This disruptive technology is quickly transforming an industry, the value it adds to the modern data center, and how you can […]
Huge amounts of information are flooding companies every second, which has led to an increased focus on big data and the ability to capture and analyze this sea of information. Enterprises are turning to big data and Apache Hadoop in order to improve business performance and provide a competitive advantage. But to unlock business value […]
With enough hype to rival even the most popular of Super Bowls, Big Data experts will converge on New York City in just a couple of weeks! But big data has good reason for all the hype, as businesses continue to find new ways to leverage the insights derived from vast data pools that are continuing […]
Big Data has become mainstream as businesses realize its benefits, including improved operation efficiency, better customer experience, and more accurate predictions. However, companies are often challenged by the complexities of traditional server solutions. In this webinar, learn how to unlock the value of Big Data with the Cisco Unified Computing System (Cisco UCS). […]
Cloudera Sessions is coming to a City Near You! Have you registered for the upcoming Cloudera Sessions roadshow yet? According to IDC analysts, the market for Big Data will reach $16.9 billion by 2015, with an outrageous 40% CAGR. As the sheer volume of data continues to climb, enterprise customers will need […]
As soaring health budgets continue to cause pain for governments and medical care providers, relief may be in sight thanks to big data. A new study by Lux Research found that advanced big data and analytics technologies are poised to help rein in runaway healthcare costs. The report "Industrial Big Data and Analytics in Digital..."
SequelGate is one of the best training institutes for Data Science & Big Data / Data Analytics training. We have been providing classroom trainings and corporate training.
All our training sessions are COMPLETELY PRACTICAL.
2014 will be remembered as the year the cyber dam broke, breached by sophisticated hackers who submerged international corporations and government agencies in a flood of hurt. Apple, Yahoo, PF Changs, AT&T, Google, Walmart, Dairy Queen, UPS, eBay, Neiman Marcus, the US Department of Energy and the IRS all reported major losses of private data relating to customers, patients, taxpayers and employees. Breaches at Boeing, US Transportation Command, the US Army Corps of Engineers, and US Investigations Services (which runs the FBI's security clearance checks) compromised national security. Prior to last year, devastating economic losses had accrued only to direct targets of cyberwarfare, such as RSA and Saudi Aramco, but in 2014, at least five companies with no military ties - JP Morgan, Target, Sony, Kmart, and Home Depot - incurred losses exceeding $100M from forensic expenses, investments in remediation, fines, legal fees, re-organizations, and class-action lawsuits, not to mention damaged brands.
The press has already reported on where things went wrong at each company, promoting a false sense of security based on the delusion that remediating this vulnerability or that one would have prevented the damage. This kind of forensic review works for aviation disasters, where we have mature, well understood systems and we can fix the problems we find in an airplane. But information networks are constantly changing, and adversaries constantly invent new exploits. If one doesn't work, they simply use another, and therein lies the folly of forensics.
Only when you step back and look at 2014 more broadly can you see a pattern that points toward a systemic failure of the security infrastructure underlying corporate networks, described below. So until we see a seismic shift in how vendors and enterprises think about security, hackers will only accelerate their pace of "ownership" of corporate and government data assets.
The Sprawl of Cyberwarfare
The breaches of 2014 demonstrate how cyberwarfare has fueled the rampant spread of cyber crime.
For the past decade, the world's three superpowers, as well as the UK, North Korea and Israel, quietly developed offensive capabilities for the purposes of espionage and military action. Destructive attacks by geopolitical adversaries have been reported on private and public sector targets in the US, Iran, South Korea, North Korea, Israel, Saudi Arabia and elsewhere. While Snowden exposed the extent of cyber espionage by the US, no one doubts that other nations prowl cyberspace to a similar or greater extent.
The technical distinction of these national cyber agencies is that they developed the means to target specific data assets or systems around the world, and to work their way through complex networks, over months or years, to achieve their missions. Only a state could commit the necessary combination of resources for such a targeted attack: the technical talent to create zero-day exploits and stealthy implants; labs that duplicate the target environment (e.g. the Siemens centrifuges of a nuclear enrichment facility); the field agents to conduct on-site ops (e.g. monitoring wireless communications, finding USB ports, or gaining employment); and years of patience. As a result of these investments in "military grade" cyber attacks, the best of these teams can boast a mission success rate close to 100%.
But cyber weapons are even harder to contain than conventional ones. Cyberwar victories have inspired terrorists, hacktivists and criminals to follow suit, recruiting cyber veterans and investing in the military grade approach. (Plus, some nations have started targeting companies directly.) No longer content to publish malware and wait for whatever data pop up, criminals now identify the crown jewels of businesses and target them with what we call Advanced Persistent Threats (APTs). You want credit cards? Get 56 million of them from Home Depot. You want to compromise people with the most sensitive secrets? Go straight to the FBI's archive of security clearances. You want the design of a new aircraft? Get it from Boeing. You need data for committing online bank theft? Get it for 76 million households at JP Morgan Chase.
That's why cyberspace exploded in 2014.
This is Not the Common Cold
But why are the crown jewels so exposed? Haven't these companies all spent millions of dollars every year on firewalls, anti-virus software, and other security products? Don't their IT departments have security engineers and analysts to detect and deflect these attacks?
The problem is that up until this year, corporate networks were instrumented to defend against generic malware attacks that cause minimal damage to each victim. Generic malware might redirect your search page, crash your hard drive, or install a bot to send spam or mine bitcoin. It's not looking for your crown jewels because it doesn't know who you are. It may worm its way to neighboring machines, but only in a singular, rudimentary way that jumps at most one or two hops. It's automated and scalable, stealing pennies from all instead of fortunes from a few. If it compromises a few machines here and there, no big deal.
But with Advanced Persistent Threats, a human hacker directs the activity, carefully spreading the implant, so even the first point of infection can lead to devastation. These attacks are more like Ebola than the common cold, so what we today call state-of-the-art security is only slightly more effective than taking Airborne (and that's a low bar). As long as corporate networks are porous to any infection at all, hackers can launch stealth campaigns, jumping from host to host as they map the network, steal passwords, spread their agents, and exfiltrate data. Doubling down on malware filters will help, but it can never be 100% effective. All it takes is one zero-day exploit, or a single imprudent click on a malicious email, tweet or search result, for the campaign to begin. Or the attacker can simply buy a point of entry from the multitudes of hackers who already have bots running on the Internet.
Too Big Data
The dependence on malware filters is only half the problem. Ask any Chief Information Officer about his or her security infrastructure and you will hear all about the Security Operations Center in which analysts pore over alerts and log files (maybe even 24/7), identifying anomalies that may indicate security incidents. These analysts are tasked with investigating the incidents and rooting out any unauthorized activity inside the network. So even if someone can penetrate the network, analysts will stop them. And indeed, thousands of security products today participate in the ecosystem by finding anomalies and generating alerts for the Security Information and Event Management (SIEM) system. Every week a new startup pops up, touting an innovative way to plow through log files, network stats, and other Big Data to identify anomalies.
But sometimes anomalies are just anomalies, and that's why a human analyst has to investigate each alert before taking any pre-emptive action, such as locking a user out of the network or re-imaging a host. And with so many products producing so many anomalies, analysts are overwhelmed with too much data. They typically see a thousand incidents every day, with enough time to investigate twenty. (You can try to find more qualified analysts, but only with diminishing returns, as each one sees less of the overall picture.)
That's why, for example, when a FireEye system at Target spotted the malware used to exfiltrate 40 million credit cards, it generated an alert for the Security Operations Center in Minneapolis, and nothing happened. Similarly, a forensic review at Neiman Marcus revealed more than 60 days of uninvestigated alerts that pointed to exfiltrating malware. Sony knew they were under attack for two years leading up to their catastrophic breach, and still they couldn't find the needles in the haystack.
And yet the drumbeat marches on, as security vendors old and new continue to tout their abilities to find anomalies. They pile more and more alerts into the SIEM, guaranteeing that most will drop on the floor. No wonder APTs are so successful.
A Three Step Program
"Know Thy Self, Know Thy Enemy" - Sun Tzu, The Art of War
We need to adapt to this new reality, and the cyber security industry needs to enable it. Simply put, businesses need to focus their time and capital on stopping the most devastating attacks.
The first step here is to figure out what those attacks look like. What are your crown jewels? What are the worst case scenarios? Do you have patient data, credit cards, stealth fighter designs, a billion dollars in the bank, damning emails, or a critical server that, if crippled by a Distributed Denial of Service attack, would cause your customers to instantly drop you? As you prioritize the threats, identify your adversaries. Is it a foreign competitor, Anonymous, disgruntled employees, or North Korea? Every business is different, and each has a different boogeyman. The good news is that even though most CEOs have never thought about it, this first step is easy and nearly free. (Cyber experts like Good Harbor or the BVP-funded K2 Intelligence can facilitate the process.)
Second, businesses need real-time threat intelligence that relates to their unique threatscapes. Almost every security technology depends upon a black list that identifies malicious IP addresses, device fingerprints, host names, domains, executables or email addresses, but naturally they come with generic, one-size-fits-all data. Dozens of startups now sell specialized threat intel, such as the BVP-funded Internet Identity, which allows clusters of similar companies to pool their cyber intelligence, or the BVP-funded iSight Partners, whose global field force of over 100 analysts tracks and profiles cyber adversaries and how to spot them in your network. What better way for your analysts to investigate the most important incidents than to prioritize the ones associated with your most formidable adversaries?
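One way to picture that prioritization step is a small scoring sketch. Everything below (indicators, actor names, weights) is invented for illustration and is not any vendor's actual product logic: each alert's base severity is boosted by the weight of any matching threat-intel entry, and analysts review only the top of the ranked list.

```python
# Invented threat-intel feed: indicator -> (adversary, priority weight).
intel = {
    "203.0.113.7": ("APT-Alpha", 9),     # tied to a profiled adversary
    "badcdn.example": ("Commodity", 3),  # generic malware infrastructure
}

# Invented SIEM alerts, each carrying one indicator and a base severity.
alerts = [
    {"id": 1, "indicator": "203.0.113.7", "severity": 4},
    {"id": 2, "indicator": "198.51.100.2", "severity": 5},
    {"id": 3, "indicator": "badcdn.example", "severity": 2},
]

def triage(alerts, intel, budget=20):
    """Rank alerts by severity plus the weight of any matching intel
    entry, returning only the `budget` an analyst can investigate."""
    def score(alert):
        _actor, weight = intel.get(alert["indicator"], (None, 0))
        return alert["severity"] + weight
    return sorted(alerts, key=score, reverse=True)[:budget]

top = triage(alerts, intel, budget=2)
```

On this toy data, the medium-severity alert tied to a profiled adversary outranks the higher-severity but unattributed one, which is exactly the reordering the paragraph argues for.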
"This is a global problem. We don't have a malware problem. We have an adversary problem. There are people being paid to try to get inside our systems 24/7"
And finally, security analysts need fewer alerts, not more. Instead of finding more anomalies, startups would better spend their time finding ways to eliminate alerts that don't matter, and highlighting the ones that do. They would provide the analysts with better tools for connecting the alerts into incidents and campaigns, tapping into the skills of experienced "military grade" hackers to profile the attack patterns.
The challenge of securing data today is obviously complex, with many other pressing opportunities for improvement such as cloud security, mobile security, application security and encryption. But as cyberwar spreads to the commercial Internet, re-orienting enterprise security to focus on Advanced Persistent Threats should be the single most important initiative for businesses and vendors alike. Of course, inertia is powerful, and it may take boards of directors, CISOs, product managers, entrepreneurs, and venture capitalists another tumultuous year in cyberspace to get the message.
PCs and smartphones have pushed mainframes to the brink of extinction on Earth, and yet mainframes still thrive in space.
Most every satellite in orbit is a floating dinosaur - a bloated, one-off, expensive, often militarized, monolithic relic of the mainframe era. The opportunity for entrepreneurs today is to launch modern computer networks into space, disrupting our aging infrastructure with an Internet of microsats.
So why has it taken so long for modern computing to reach space? Gravity. It's hard to launch things. Governments have the money and patience to do it, as do large cable and telecom corporations. These players are slow to innovate, and large satellites have met their basic needs around science, defense, and communications, albeit at very high costs.
That's changing: several IT trends have come together to herald the extinction of these orbiting pterodactyls:
Moore's law has reached the point where a single rocket launch can be amortized across dozens of tiny satellites, and the replacement cost is so low that we needn't burden our missions with triple redundancies and a decade of testing;
Global computing clouds make it easy to deploy ground stations; and
Advances in Big Data enable us to process the torrential flows of information we get from distributed networks.
These trends have reduced the cost of a single aerospace mission from a billion dollars down to a hundred million just as the early-stage VC community amassed enough capital to undertake projects of this scope. And now that a handful of venture-backed startups like SpaceX and Skybox are demonstrating success, the number of aerospace business plans circulating through Sand Hill Road has climbed faster than a Falcon 9.
With each successful startup, progress accelerates and synergies emerge. As SpaceX makes launches cheaper, it opens the frontier to more entrepreneurs. Pioneers like Skybox and Planet Labs have to build end-to-end solutions for their markets, including everything from satellite buses to big data search algorithms; but there will soon evolve an ecosystem of vendors who specialize in launch mechanisms, cubesats, sensors, inter-sat communications, analytics, and software applications.
So who are the customers for a space-based Internet? At first, aerospace startups will disrupt two large markets:
· Scientific exploration of space. In the past, costly scientific missions such as Apollo ($355 million in 1966), the ISS ($3 billion/year), Hubble ($10 billion), and Cassini ($3.3 billion) were designed and built by government agencies. Expect startups to disrupt this market with innovations in rocketry, robotics, optics, cloud computing, space suits, renewable energy, and more.
· Communications. Government defense agencies spend considerable sums on communications to serve their space-based weapon systems and intelligence bureaus. Media and cable companies also commission satellites to serve their consumers. Microsat networks of radios will supply these customers more cheaply and reliably.
While spatial avionics improve with Moore's Law, some payloads, like telescopes and robots, certainly cannot be miniaturized beyond the constraints of physics. But even these missions will benefit from the cheap, rapid testing available on a nanosatellite. Just as programmers today can build entire software companies using a free AWS account and the open source LAMP stack, space-faring entrepreneurs can now explore myriad new business models by launching $1,000 cubesats out of the ISS.
In addition to disrupting existing markets, microsat networks in space will enable a new and important capability: Planetary Awareness. When we surround our planet with sensors across the frequency spectrum, we will have access to data that opens up new markets. Today, we have sensors across our landmasses, but adding sensors in space, the ocean, and the atmosphere will illuminate both natural phenomena and human logistics.
Planetary Awareness will enable many capabilities of high social value:
· Aviation and maritime safety: The need for tracking and communicating with aircraft and ships is in the public eye today following the loss of flight MH370.
· Nature surveillance: Predict and monitor weather, global warming, natural disasters, and the risk of meteor damage (as pioneered by the B612 Foundation).
· Global journalism: Expose protests, genocides, and other state-censored events.
Planetary Awareness will also open new markets of high economic value, which are much more likely to drive the success of aerospace startups:
· Finding natural resources: Minerals and fuel sources abound on the ocean floor (as discovered by Liquid Robotics' fleet of WaveGliders) and near-Earth asteroids (as Planetary Resources promises to find using cheap microsats).
· Financial services: Tracking human activity and commerce (e.g. the proverbial counting of cars in parking lots) yields valuable data to merchants, logistics providers and investors.
· Military and geopolitical intelligence: Governments already purchase imagery for this use, but visibility will greatly expand from more frequent flyovers, video, radio surveillance, and automated analytics.
Geospatial imaging attracts many startups because it is already a robust and underserved market, but the opportunity to enable planetary awareness is much broader. Dan Berkenstock didn't start Skybox Imaging just to sell images and video: he had a more profound vision for the impact that startups can have on the aerospace industry. His mission attracted co-founders from Stanford and NASA, his CEO Tom Ingersoll from Universal Space, aerospace legends like Joe Rothenberg (who led the Hubble repair), and other star engineers and investors. And now Skybox is proving that they, along with SpaceX and other nimble startups, will displace dinosaurs in space with data services driven by constellations of smart microsats.
Our growing computer security problems will create many new companies.
The threat from cyber-intrusions seems to have exploded in just the last 18 months. Mainstream media now report regularly on massive, targeted data breaches and on the digital skirmishes waged among nation states and cybermilitants.
Unlike other looming technical problems that require innovation to address, cybersecurity never gets solved. The challenges of circuit miniaturization, graphical computing, database management, network routing, server virtualization, and similarly mammoth technical problems eventually wane as we tame their complexity. Cybersecurity is a never-ending Tom and Jerry cartoon. Like antibiotic-resistant bacteria, attackers adapt to our defenses and render them obsolete.
As in most areas of IT and computing, innovation in security springs mostly from startup companies. Larger systems companies like Symantec, Microsoft, and Cisco contribute to the corpus of cybersecurity, but mostly acquire their new technologies from startups. Government agencies with sophisticated cyberskills tend to innovate more on the offensive side. I think that in the coming years we will see many small, creative teams of security engineers successfully discovering, testing, and building out clever new ways to secure cyberspace.
Anyone looking to found or invest in one of those small security companies destined for success should focus on the tsunami of change rocking the IT world known as cloud computing. In a transformation that eclipses even the advent of client-server computing in the 1980s, businesses are choosing to subscribe to services in the cloud rather than run software on their own physical servers. Incumbents in every category of software are being disrupted by cloud-based upstarts. According to Forrester, the global market for cloud computing will grow more than sixfold this decade, to over a quarter trillion dollars.
Cloud security, as it is known, is today one of the less mature areas of cloud computing, but it is already clear that it will become a significant chunk of that vast new market. A Gartner report earlier this year predicted that growth in cloud-based security services would overtake that of traditional security services within the next three years.
Just like other software products, conventional security appliances are being replaced by cloud-based alternatives that are easier to deploy, cheaper to manage, and always up-to-date. Cloud-based security protections can also be more secure, since the vendor can correlate events and profile attacks across all of its customersâ networks. This collaborative capability will be critical in the coming years as the private sector looks to government agencies like the National Security Agency for protection from cyberattacks.
The cloud also enables new security services based on so-called big data, which could simply not exist as standalone products. Companies like SumoLogic can harvest signals from around the Web for analysis, identifying attacks and attackers that couldn't be detected using data from a single incident or source.
These new data-centric, cloud-based security products are crucial to solving the challenges of keeping mobile devices secure. Most computers shipped today are mobile devices, and they make juicier targets than PCs because they have location and payment data, microphones, and cameras. But mobile carriers and employers cannot lock down phones and tablets completely because they are personal devices customized with personal apps. Worse, phones and tablets lack the processing power and battery life to run security processes as PCs do.
Cloud approaches to security offer a solution. Software-as-a-service security companies like Zscaler can scan our mobile data traffic using proxies and VPNs, scrubbing them for malware, phishing, data leaks, and bots. In addition we see startups like Blue Cava, Iovation, and mSignia using Big Data to prevent fraud by fingerprinting mobile devices.
Cloud security also involves protecting cloud infrastructure itself. New technologies are needed to secure the client data inside cloud-based services against theft or manipulation during transit or storage. Some security auditors and security companies already sell into this market, but most cloud developers, focused on strong customer growth, have been slow to deploy strong security. Eventually it should become possible for cloud computing customers to encrypt and destroy data using their own encryption keys. Until then, there is an opportunity for startups such as CipherCloud and Vaultive to sell encryption technology that companies layer on top of their cloud services to encrypt the data inside.
Lastly, cloud security also includes protecting against the cloud itself, which enables creative new classes of attack. For example, Amazon Web Services can be used for brute-force attacks on cryptographic protocols, as one German hacker demonstrated in 2010 by breaking the NSA's Secure Hashing Algorithm. Attackers can use botnets and virtual servers to wage distributed denial-of-service attacks, and bots can bypass captcha defenses by crowdsourcing the answers. Cloud-based attacks demand innovative defenses that will likely come from startups. For example, Prolexic and Defense.net (a company Bessemer has invested in) operate networks of filters that buffer their clients from cloud-based DDoS attacks.
Cloud computing may open up enormous vulnerabilities on the Internet, but it also presents great opportunity for innovative cybersecurity. In the coming decade, few areas of computing will be as attractive to entrepreneurs, technologists, and investors.
Ben Ainslie's AC45 Land Rover BAR repeats its triumph in the Louis Vuitton Series regatta at Portsmouth (United Kingdom). The boat uses a Formula 1 'Big Data' system to compete.
Windows Azure is a platform that has you covered, whether you need to write software, run software that is already written, or install and use "canned" software, whether you or someone else wrote it. Like any platform, it's a set of tools you can use where it makes sense to solve a problem.
You can click on the graphic below for a larger picture of these components, or download a poster with more details here.
The primary location for Windows Azure information is located at http://windowsazure.com. You can find everything there from the development kits for writing software to pricing, licensing and tutorials on all of that.
I have a few links here for learning to use Windows Azure – although it’s best if you focus not on the tools, but what you want to solve. I’ve got it broken down here into various sections, so you can quickly locate things you want to know. I’ll include resources here from Microsoft and elsewhere – I use these same resources in the Architectural Design Sessions (ADS) I do with my clients worldwide.
Also called "Platform as a Service" (PaaS), Windows Azure has lots of components you can use together or separately that allow you to write software in .NET or various Open Source languages to work completely online, in partnership with code you have on-premises, or both – even if you're using other cloud providers. Keep in mind that all of the features you see here can be used together or independently. For instance, you might use only a Web Site or only Storage, or you can use both together. You can access all of these components through standard REST API calls, or through our Software Development Kit APIs, which are a lot easier. In any case, you simply use Visual Studio, Eclipse, Cloud9 IDE, or even a text editor to write your code from a Mac, PC or Linux machine.
Components you can use:
Azure Web Sites: Windows Azure Web Sites allow you to quickly write and deploy websites without setting up a Virtual Machine, installing a web server or configuring complex settings. They work alone, with other Windows Azure Web Sites, or with other parts of Windows Azure. Read more about deciding to use Web Sites or Roles.
Web and Worker Roles: Windows Azure Web Roles give you a full stateless computing instance with Internet Information Services (IIS) installed and configured. Windows Azure Worker Roles give you a full stateless computing instance without IIS installed, often used in a "Services" mode. Scale-out is achieved either manually or programmatically under your control.
Storage: Windows Azure Storage types include Blobs to store raw binary data, Tables to use key/value pair data (like NoSQL data structures), Queues that allow interaction between stateless roles, and a relational SQL Server database.
Other Services: Windows Azure has many other services such as a security mechanism, a Cache (memcacheD compliant), a Service Bus, a Traffic Manager and more. Once again, these features can be used with a Windows Azure project, or alone based on your needs.
Various Languages: Windows Azure supports the .NET stack of languages, as well as many Open-Source languages like Java, Python, PHP, Ruby, NodeJS, C++ and more.
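Those REST calls are authenticated by signing each request with the storage account key. As a rough, simplified illustration (the real scheme requires a fully canonicalized request string, and the account name and key below are made up), Azure's SharedKey authorization boils down to an HMAC-SHA256 over a canonical form of the request:

```python
import base64
import hashlib
import hmac

def shared_key_header(account, key_b64, string_to_sign):
    """Build a SharedKey Authorization header value: the base64-decoded
    account key signs the canonicalized request string via HMAC-SHA256.
    (Simplified sketch; the service defines the exact string-to-sign.)"""
    key = base64.b64decode(key_b64)
    digest = hmac.new(key, string_to_sign.encode("utf-8"), hashlib.sha256).digest()
    signature = base64.b64encode(digest).decode("utf-8")
    return f"SharedKey {account}:{signature}"

# Hypothetical account, key, and request string, for illustration only
header = shared_key_header(
    "myaccount",
    base64.b64encode(b"not-a-real-key").decode("utf-8"),
    "GET\n\n\n/myaccount/mycontainer",
)
print(header)
```

The SDKs hide exactly this kind of bookkeeping, which is why the post calls them "a lot easier" than raw REST.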
Also called "Software as a Service" (SaaS), this often means consumer- or business-level software like Hotmail or Office 365. In other words, you simply log on, use the software, and log off – there's nothing to install, and little even to configure. For the Information Technology professional, however, it's not quite the same. We want software that provides services, but in a platform. That means we want things like Hadoop or other software we don't want to have to install and configure.
Components you can use:
Kits: Various software “kits” or packages are supported with just a few clicks, such as Umbraco, Wordpress, and others.
Windows Azure Media Services: Windows Azure Media Services is a suite of services that allows you to upload media for encoding, processing and streaming – or any one of those functions. We can add DRM and even commercials to your media if you like. Windows Azure Media Services is used to stream everything from large events all the way down to small training videos.
Windows Azure Marketplace: Windows Azure Marketplace offers data and programs you can quickly implement and use – some free, some for-fee.
Also known as “Infrastructure as a Service” (IaaS), this offering allows you to build or simply choose a Virtual Machine to run server-based software.
Components you can use:
Persistent Virtual Machines: You can choose to install Windows Server, Windows Server with Active Directory, with SQL Server, or even SharePoint from a pre-configured gallery. You can configure your own server images with standard Hyper-V technology and load them yourselves – and even bring them back when you’re done. As a new offering, we also even allow you to select various distributions of Linux – a first for Microsoft.
Storage: Windows Azure Storage can be used as a remote backup, a hybrid storage location and more using software or even hardware appliances.
With all of these options, you can use Windows Azure to solve just about any computing problem. It’s often hard to know when to use something on-premises, in the cloud, and what kind of service to use.
I’ve used a decision matrix in the last couple of years to take a particular problem and choose the proper technology to solve it. It’s all about options – there is no “silver bullet”, whether that’s Windows Azure or any other set of functions. I take the problem, decide which particular component I want to own and control – and choose the column that has that box darkened. For instance, if I have to control the wiring for a solution (a requirement in some military and government installations), that means the “Networking” component needs to be dark, and so I select the “On Premises” column for that particular solution. If I just need the solution provided and I want no control at all, I can look at “Software as a Service” solutions.
Security: Security is one of the first questions you should ask in any distributed computing environment. We have certification info, coding guidelines and more, even a general “Request for Information” RFI Response already created for you.
(As with all of these types of posts, check the date of the latest update I’ve made here. Anything older than 6 months is probably out of date, given the speed with which we release new features into Windows and SQL Azure)
I don’t normally like to discuss things in terms of tools. I find that whenever you start with a given tool (or even a tool stack) it’s too easy to fit the problem to the tool(s), rather than the other way around as it should be.
That being said, it’s often useful to have an example to work through to better understand a concept. But like many ideas in Computer Science, “Big Data” is too broad a term in use to show a single example that brings out the multiple processes, use-cases and patterns you can use it for.
So we turn to a description of the tools you can use to analyze large data sets. “Big Data” is a term used lately to describe data sets that have the “Four V’s” as a characteristic, but I have a simpler definition I like to use:
Big Data involves a data set too large to process in a reasonable period of time
I realize that’s a bit broad, but in my mind it answers the question and is fairly future-proof. The general idea is that you want to analyze some data, and using whatever current methods, storage, compute and so on that you have at hand it doesn’t allow you to finish processing it in a time period that you are comfortable with. I’ll explain some new tools you can use for this processing.
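One concrete consequence of that definition is that you stop loading the whole data set at once and instead process it piece by piece. A minimal sketch in Python (a generator stands in for a file too large to hold in memory; the chunk size is an arbitrary illustration):

```python
def process_in_chunks(lines, chunk_size=1000):
    """Compute an average over a data stream chunk by chunk, never
    materializing the full data set in memory. The same pattern scales
    from one large file up to a distributed store."""
    total, count = 0.0, 0
    chunk = []
    for line in lines:
        chunk.append(float(line))
        if len(chunk) == chunk_size:
            total += sum(chunk)
            count += len(chunk)
            chunk = []
    total += sum(chunk)  # flush the final partial chunk
    count += len(chunk)
    return total / count if count else 0.0

# A generator plays the role of a data set too large to load at once
avg = process_in_chunks(str(i) for i in range(1_000_000))
```

Tools like Hadoop apply essentially this idea, but split the chunks across many machines and processes.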
Yes, this post is Microsoft-centric. There are probably posts from other vendors and open-source that cover this process in the way they best see fit. And of course you can always “mix and match”, meaning using Microsoft for one or more parts of the process and other vendors or open-source for another. I never advise that you use any one vendor blindly - educate yourself, examine the facts, perform some tests and choose whatever mix of technologies best solves your problem.
At the risk of being vendor-specific, and probably incomplete, I use the following short list of tools Microsoft has for working with “Big Data”. There is no single package that performs all phases of analysis. These tools are simply what I use; they should not be taken as an authoritative Microsoft statement of the toolset to settle on for a given problem space. In fact, that’s the key: find the problem and then fit the tools to that.
I break up the analysis of the data into two process types. The first is examining and processing the data in-line, meaning as the data passes through some process. The second is a store-analyze-present process.
Processing Data In-Line
Processing data in-line means that the data doesn’t have a destination - it remains in the source system. But as it moves from an input or is routed to storage within the source system, various methods are available to examine the data as it passes, and either trigger some action or create some analysis.
You might not think of this as “Big Data”, but in fact it can be. Organizations have huge amounts of data stored in multiple systems. Many times the data from these systems do not end up in a database for evaluation. There are options, however, to evaluate that data real-time and either act on the data or perhaps copy or stream it to another process for evaluation.
The advantage of an in-stream data analysis is that you don’t necessarily have to store the data again to work with it. That’s also a disadvantage - depending on how you architect the solution, you might not retain a historical record. One method of dealing with this requirement is to trigger a rollup collection or a more detailed collection based on the event.
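That rollup-on-trigger idea can be sketched in a few lines. This is not StreamInsight code, just an illustration of the pattern: keep only a bounded rolling window as data passes by, and capture a detailed snapshot when an event of interest fires (the threshold and window size here are hypothetical):

```python
from collections import deque

class InlineMonitor:
    """Examine events as they pass through the stream; retain only a
    rolling summary, and capture full detail when a threshold fires."""
    def __init__(self, threshold, window=100):
        self.threshold = threshold
        self.rolling = deque(maxlen=window)  # rollup: bounded history
        self.captured = []                   # detail kept only on trigger

    def observe(self, value):
        self.rolling.append(value)
        if value > self.threshold:
            # trigger: snapshot the recent window for later analysis
            self.captured.append(list(self.rolling))

monitor = InlineMonitor(threshold=90)
for v in [10, 20, 95, 30]:
    monitor.observe(v)
# monitor.captured now holds one snapshot: [10, 20, 95]
```

The stream itself is never stored, which is exactly the trade-off described above.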
StreamInsight - StreamInsight is Microsoft’s “Complex Event Processing” or CEP engine. This product, hooked into SQL Server 2008R2, has multiple ways of interacting with a data flow. You can create adapters to talk with systems, and then examine the data mid-stream and create triggers to do something with it. You can read more about StreamInsight here: http://msdn.microsoft.com/en-us/library/ee391416(v=sql.110).aspx
BizTalk - When there is more latency available between the initiation of the data and its processing, you can use Microsoft BizTalk. This is a message-passing and Service Bus oriented tool, and it can also be used to join data from systems that normally do not have a direct link, for instance a Mainframe system to SQL Server. You can learn more about BizTalk here: http://www.microsoft.com/biztalk/en/us/overview.aspx
.NET and the Windows Azure Service Bus - Along the same lines as BizTalk but with a more programming-oriented design are the Windows and Windows Azure Service Bus tools. The Service Bus allows you to pass messages as well, and opens up web interactions and even inter-company routing. BizTalk can do this as well, but the Service Bus tools use an API approach for designing the flow and interfaces you want. The Service Bus offerings are also intended as near real-time, not as a streaming interface. You can learn more about the Windows Azure Service Bus here: http://www.windowsazure.com/en-us/home/tour/service-bus/ and more about the Event Processing side here: http://msdn.microsoft.com/en-us/magazine/dd569756.aspx
A more traditional approach with an organization’s data is to store the data and analyze it out-of-band. This began with simply running code over a data store, but as locking and blocking became an issue on a file system, Relational Database Management Systems (RDBMs) were created. Over time a distinction was made between data used in an online processing system, meant to be highly available for writing data (OLTP) and systems designed for analytical and reporting purposes (OLAP).
Later the data grew larger than these systems were designed for, primarily due to consistency requirements. In analysis, however, consistency isn’t always a requirement, and so file-based systems for that analysis were re-introduced from the Mainframe concepts, with new technology layered in for speed and size.
I normally break up the process of analyzing large data sets into four phases:
Source and Transfer - Obtaining the data at its source and transferring or loading it into the storage; optionally transforming it along the way
Store and Process - Data is stored on some sort of persistence, and in some cases an engine handles the acquisition and placement on persistent storage, as well as retrieval through an interface.
Analysis - A new layer introduced with “Big Data” is a separate analysis step. This is dependent on the engine or storage methodology, is often programming language or script based, and sometimes re-introduces the analysis back into the data. Some engines and processes combine this function into the previous phase.
Presentation - In most cases, the data needs a graphical representation to be comprehensible, especially in a series or trend analysis. In other cases a simple symbolic representation suffices, similar to the “dashboard” elements in a Business Intelligence suite. Presentation tools may also have an analysis or refinement capability to allow end-users to work with the data sets. As in the Analysis phase, some methodologies bundle the Analysis and Presentation phases into one toolset.
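The four phases above can be sketched as a toy pipeline. Every name and data value here is invented for illustration; in practice each function would be a product from the sections below (SSIS, SQL Server, T-SQL, SharePoint) rather than a few lines of Python:

```python
def source_and_transfer():
    """Phase 1: obtain raw records (a list stands in for an extract)."""
    return ["2,a", "5,b", "3,a"]

def store_and_process(rows):
    """Phase 2: parse and persist (here, an in-memory store)."""
    return [(int(v), k) for v, k in (r.split(",") for r in rows)]

def analyze(records):
    """Phase 3: aggregate by key."""
    totals = {}
    for value, key in records:
        totals[key] = totals.get(key, 0) + value
    return totals

def present(totals):
    """Phase 4: render a simple textual 'dashboard'."""
    return "\n".join(f"{k}: {'#' * v}" for k, v in sorted(totals.items()))

report = present(analyze(store_and_process(source_and_transfer())))
```

The value of naming the phases is that each one can be swapped out or scaled independently.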
Source and Transfer
You’ll notice in this area, along with those that follow, that Microsoft is adopting not only its own technologies but also those from open source. This is a positive sign, and means that you will have a best-of-breed, supported set of tools to move the data from one location to another. Traditional file copy, File Transfer Protocol and the like are certainly options, but they are not normally suited to moving large datasets.
I’ve already mentioned the ability of a streaming tool to push data into a store-analyze-present model, so I’ll follow up that discussion with the tools that can extract data from one source and place it in another.
SQL Server Integration Services (SSIS)/SQL Server Bulk Copy Program (BCP) - SSIS is a SQL Server tool used to move data from one location to another, optionally performing transforms or other processes as it does so. You are not limited to working with SQL Server data - in fact, almost any modern source of data, from text files to various database platforms, can be moved between systems. It is also extremely fast and has a rich development environment. You can learn more about SSIS here: http://msdn.microsoft.com/en-us/library/ms141026.aspx BCP is a tool that has been used with SQL Server data since the first releases; it supports multiple sources and destinations as well. It is a command-line utility, and has some limited transform capabilities. You can learn more about BCP here: http://msdn.microsoft.com/en-us/library/ms162802.aspx
Sqoop- Tied to Microsoft’s latest announcements with Hadoop on Windows and Windows Azure, Sqoop is a tool that is used to move data between SQL Server 2008R2 (and higher) and Hadoop, quickly and efficiently. You can read more about that in the Readme file here: http://www.microsoft.com/download/en/details.aspx?id=27584
Application Programming Interfaces - APIs exist in almost every major language that can connect to one data source, access the data, optionally transform it, and store it in another system. Almost every dialect of the .NET-based languages contains methods to perform this task.
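As a small example of this source-transfer-transform pattern using only standard library APIs (the CSV content and table name are invented; a real job would read from a source system and write to a server):

```python
import csv
import io
import sqlite3

# A tiny CSV extract stands in for a source system's export
raw = io.StringIO("region,sales\neast,100\nwest,250\n")

# The destination: an in-memory relational store
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount INTEGER)")

for row in csv.DictReader(raw):
    # optional transform step: normalize case while loading
    conn.execute("INSERT INTO sales VALUES (?, ?)",
                 (row["region"].upper(), int(row["sales"])))

total = conn.execute("SELECT SUM(amount) FROM sales").fetchone()[0]
```

Swap the CSV for a mainframe export and SQLite for SQL Server and the shape of the code barely changes; that is the appeal of the API approach over a one-off file copy.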
Store and Process
Data at rest is normally used for historical analysis. In some cases this analysis is performed near real-time, and in others historical data is analyzed periodically. Systems that handle data at rest range from simple storage to active management engines.
SQL Server - Microsoft’s flagship RDBMS can indeed store massive amounts of complex data. I am familiar with two systems in excess of 300 terabytes of federated data, and the Pan-STARRS project is designed to handle over 1 petabyte of data. The theoretical limit of SQL Server DataCenter edition is 540 petabytes. SQL Server is an engine, so the data access and storage is handled in an abstract layer that also handles concurrency for ACID properties. You can learn more about SQL Server here: http://www.microsoft.com/sqlserver/en/us/product-info/compare.aspx
HPC Server - Microsoft’s High-Performance Computing version of Windows Server deals not only with large data sets, but with extremely complicated computing requirements. A scale-out architecture and inter-operation with Linux systems, as well as dozens of applications pre-written to work with this server make this a capable “Big Data” system. It is a mature offering, with a long track record of success in scientific, financial and other areas of data processing. It is available both on premises and in Windows Azure, and also in a hybrid of both models, allowing you to “rent” a super-computer when needed. You can read more about it here: http://www.microsoft.com/hpc/en/us/product/cluster-computing.aspx
Windows and Azure Storage - Although not an engine - other than a triple-redundant, immediately consistent commit - Windows Azure can hold terabytes of information and make it available to everything from the R programming language to the Hadoop offering. Binary storage (Blobs) and Table storage (Key-Value Pair) data can be queried across a distributed environment. You can learn more about Windows Azure storage here: http://msdn.microsoft.com/en-us/library/windowsazure/gg433040.aspx
In a “Big Data” environment, it’s not unusual to have a specialized set of tasks for analyzing and even interpreting the data. This is a new field called “Data Science”, with a requirement not only for computing skills, but also a heavy emphasis on math.
Transact-SQL - T-SQL is the dialect of the Structured Query Language used by Microsoft. It includes not only robust selection, updating and manipulating of data, but also analytical and domain-level interrogation as well. It can be used on SQL Server, PDW and ODBC data sources. You can read more about T-SQL here: http://msdn.microsoft.com/en-us/library/bb510741.aspx
Application Programming Interfaces - Almost all of the analysis offerings have associated APIs - of special note is Microsoft Research’s Infer.NET, a framework for running Bayesian inference in graphical models and for probabilistic programming. You can read more about Infer.NET here: http://research.microsoft.com/en-us/um/cambridge/projects/infernet/
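To give a sense of what “Bayesian inference” means in the simplest case (this is a generic textbook example in Python, not Infer.NET, which handles far richer graphical models): with a Beta prior over an unknown rate and binomial observations, the posterior is just the prior with the observed counts added in. The counts below are invented:

```python
def beta_binomial_update(prior_a, prior_b, successes, failures):
    """Conjugate Bayesian update: a Beta(a, b) prior over a rate plus
    observed success/failure counts yields a Beta(a+s, b+f) posterior.
    Frameworks like Infer.NET generalize this kind of inference to
    large graphical models where no closed form exists."""
    return prior_a + successes, prior_b + failures

# Uniform prior Beta(1, 1); hypothetical observations: 7 hits, 3 misses
a, b = beta_binomial_update(1, 1, successes=7, failures=3)
posterior_mean = a / (a + b)  # estimated rate: 8 / 12
```

The math emphasis mentioned above shows up exactly here: knowing which model fits the data is the hard part, not running the code.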
Lots of tools can present the data once you have done the primary analysis. In fact, there’s a great video comparing various tools here: http://msbiacademy.com/Lesson.aspx?id=73 It is primarily focused on Business Intelligence. That term itself is no longer so crisply defined, but the tools I’ll show below can be used in multiple ways - not just traditional Business Intelligence scenarios. Application Programming Interfaces (APIs) can also be used for presentation, but I’ll focus here on “out of the box” tools.
SharePoint Services - Microsoft has rolled several capable tools in SharePoint as “Services”. This has the advantage of being able to integrate into the working environment of many companies. You can read more about lots of these reporting and analytic presentation tools here: http://technet.microsoft.com/en-us/sharepoint/ee692578
This is by no means an exhaustive list - more capabilities are added all the time to Microsoft’s products, and things will surely shift and merge as time goes on. Expect today’s “Big Data” to be tomorrow’s “Laptop Environment”.
A medical imaging SaaS company is seeking a talented DBA (infrastructure). You will work with big data, scale the company's data set to millions of samples, perform large-scale data analysis and research, and handle performance, scale, availability, accuracy and monitoring.
Join Larry Jordan and co-host Michael Horton as they talk with:
Zack Arnold, Editor/Director
Zack Arnold has been a professional editor for 15 years, most recently on the TV series "Burn Notice." He made his directorial debut on the documentary "GO FAR: The Christopher Rush Story." Zack's latest venture is founding "Fitness In Post," an online resource and community built specifically for people in the post-production industry who want to live a healthier lifestyle but don't know where to start. This week, he tells us more about it.
Jonathan Handel, Entertainment/Technology Attorney & Labor Reporter, TroyGould and The Hollywood Reporter
SAG/AFTRA is claiming "significant gains" in the new contract just ratified. Jonathan Handel, Entertainment/Technology Attorney and labor reporter for "The Hollywood Reporter," gives us the details.
Marty Lafferty, CEO, DCIA (Distributed Computing Industry Association)
The DCIA (Distributed Computing Industry Association) is an international trade organization focused on commercial advancement of cloud computing and related technologies, particularly as they are deployed for the delivery of high-value content. Marty Lafferty, CEO of DCIA, joins us this week to explain how big data, globalization and cloud computing are dramatically changing telecommunications.
With nearly 91,000 vessels, the global maritime industry crosses social, economic and geographic frontiers, but it has not yet crossed the data boundary by embracing big data. With connectivity options and speeds improving, ships are beginning to join a data revolution that promises efficiency and cost savings. However, the question remains: what will be the …
The laws of intestacy are the same for men and women even though preferences for how one's estate should be divided differ by gender. Peanut-allergic octogenarian men and gluten-allergic pregnant women see the same warnings on consumer products even though they are interested in seeing information that is much better tailored to them. Companies have made enormous strides in studying and classifying groups of consumers, and yet almost none of this information is put to use by providing consumers with contractual default terms or disclosures that are tailored to their preferences and attributes. This lecture will explore the costs and benefits of personalizing various parts of American law and business practices. This talk was recorded on April 7, 2014. Lior Strahilevitz is Sidley Austin Professor of Law at the University of Chicago Law School.
Microsoft have announced the launch of 6 new MCSA certifications and 1 new MCSE certification. This demonstrates Microsoft's commitment to a growing Azure, Big Data, Business Intelligence (BI) and Dynamics community. These new certifications and courses will support Microsoft partners looking to upskill and validate knowledge in these technologies.
Following the huge changes announced in September, these new launches will simplify your path to certification. They'll minimise the number of steps required to earn a certification, while allowing you to align your skills to industry-recognised areas of competence.
This blog will outline the new certifications Microsoft have announced, focusing on the technologies, skills and job roles they align to.
So what's new?
MCSA: Microsoft Dynamics 365
This MCSA: Microsoft Dynamics 365 certification is one of three Dynamics 365 certifications launched. It demonstrates your expertise in upgrading, configuring and customising the new Microsoft Dynamics 365 platform.
There are currently no MOCs aligned to this certification. We have developed our own Firebrand material that will prepare you for the following two exams needed to achieve this certification:
MB2-715: Microsoft Dynamics 365 Customer Engagement Online Deployment
MB2-716: Microsoft Dynamics 365 Customization and Configuration
This certification will validate you have the skills for a position as a Dynamics 365 developer, implementation consultant, technical support engineer or system administrator.
This certification is a prerequisite for the MCSE: Business Applications.
MCSA: Microsoft Dynamics 365 for Operations
On this course you'll cover the following MOC:
20764: Administering a SQL Database Infrastructure
The second part of this course, for which there is currently no MOC, will be covered by Firebrand's own material.
To achieve this certification you'll need to pass the following exams:
70-764: Administering a SQL Database Infrastructure
MB6-890: Microsoft Dynamics AX Development Introduction
Earning this cert proves you have the technical competence for positions such as Dynamics 365 developer, solutions architect or implementer.
Just like the MCSA: Microsoft Dynamics 365, this certification is also a prerequisite to the new MCSE: Business Applications certification.
MCSE: Business Applications
Earning an MCSE certification validates a more advanced level of knowledge. The MCSE: Business Applications certification proves an expert-level competence in installing, operating and managing Microsoft Dynamics 365 technologies in an enterprise environment.
In order to achieve this certification you'll be required to pass either the MCSA: Microsoft Dynamics 365 or the MCSA: Microsoft Dynamics 365 for Operations. You'll also be required to choose one of the following electives to demonstrate expertise in a business-specific area:
Earning your MCSE: Business Applications certification will qualify you for roles such as Dynamics 365 developer, implementation consultant, technical support engineer, or system administrator.
MCSA: Big Data Engineering
This MCSA: Big Data Engineering certification demonstrates you have the skills to design and implement big data engineering workflows with the Microsoft cloud ecosystem and Microsoft HDInsight to extract strategic value from your data.
On this course you'll cover the following MOCs:
20775A: Perform Data Engineering on Microsoft HDInsight â expected 28/6/2017
20776A: Engineering Data with Microsoft Cloud Services â expected 08/2017
And take the following exams:
70-775: Perform Data Engineering on Microsoft HDInsight â available now in beta
70-776: Engineering Data with Microsoft Cloud Services â expected Q1 2018
This course is aimed at data engineers, data architects, data scientists and data developers.
MCSA: Machine Learning
This course will teach you to operationalise Microsoft Azure machine learning and Big Data with R Server and SQL R Services. You'll learn to process and analyse large data sets using R, and to use Azure cloud services to build and deploy intelligent solutions.
This certification covers the following MOCs:
20773A: Analyzing Big Data with Microsoft R â in development, expected May 2017
20774A: Perform Cloud Data Science with Azure Machine Learning â in development, expected June 2017
To achieve this certification you'll be required to pass the following exams:
70-773: Analyzing Big Data with Microsoft R â available now in beta
70-774: Perform Cloud Data Science with Azure Machine Learning â available now in beta
Big Data: "Big data offers a lot of opportunities to the few companies who use them. However, one main reason why a larger percentage of the corporate world is yet to embrace big data is because of...
The transformative promise of Big Data Analytics is to generate actionable insights from massive amounts of constantly evolving data, and to then leverage those insights to achieve positive, meaningful business and societal outcomes. Listen as Ayata's SVP of Sales and Marketing, Daniel Mohan, discusses how Ayata is helping top-performing operators in unconventional plays frustrated by
David J. Garrow's Rising Star: The Making of Barack Obama (New York: William Morrow, 2017) is a big book. Its ten chapters of narrative occupy 1078 pages; the remaining 383 pages consist of the acknowledgement (1079-1084), the copious chapter notes (1085-1356), the bibliography (1357-1391), the index (1393-1460) and the "About the Author" page (1461). Are so many pages needed to cover the life of Barack Hussein Obama II from August 4, 1961 to January 19, 2017? Yes. Do so many pages adequately provide full disclosure of Obama's rise as our most noteworthy Kenyan American and 44th President? No. A single book can't possibly give us all the contextualized facts we either need to know or think we need. A trenchant analysis of anything in our everyday lives, especially of major figures and events in American politics, requires a crunching of big data and the writing of persuasive narratives. Rising Star is Garrow's effort to make a compelling statement about our rage for social, political, and cultural information. His success, however, compounds the difficulty of knowing what is truly necessary and sufficient.
Reading Rising Star cover to cover is probably not the path many readers will take. They will sample chapters and depend on the index to guide them to topics which seem to be of immediate relevance. Unlike their nineteenth-century ancestors, most contemporary readers lack the patience and discipline to engage a big book -- unless the book pertains directly to a job, career advancement or retrofitting, and a paycheck. Even for readers who work in the arena of politics, policy decisions may be of greater importance than expanding their sense of history. Rising Star will be relegated to a shelf of reference books and consulted only when a search engine doesn't provide immediate access to specialized information or "factoids" about President Obama and his eight years in office.
We can anticipate that Rising Star will eventually appear on the collateral reading lists for advanced graduate courses in American government, political theory, historiography, or the politics of race. Special, limited audiences of teachers and students will explore Garrow's artistry in aligning snapshots of Obama the man (organic human being) with formal photographs of Obama the president (the fashioned or constructed political being). They will be positioned to make sense of Garrow's pragmatic coup de grâce:
In Springfield too a perceptive woman understood how Barack "is an invention of himself." But it was essential to appreciate that while the crucible of self-creation had produced an ironclad will, the vessel was hollow at its core. "You didn't let anyone sneak up behind you to see emotions -- like hurt or fear -- you didn't want them to see," Barack long ago had taught himself, yet hand in hand with that resolute self-discipline came a profound emptiness. (1078) [my italics]
Irony of irony that what is imagined to be hollow and empty will in time be seen to be solid and full. We shall need yet another 1461 pages to begin to understand the quintessential American irony that Garrow invites us to ponder.
Interesting experience for Ubiquity at the Money 20/20 Europe event in Copenhagen. Money 20/20 Europe, the most important fintech event in Europe, has been designed to bring together all stakeholders with a role to play in the trade revolution: payment and financial services providers, banks, the mobile ecosystem, the retail industry, marketing services, big data …
When Orange Is the New Black, House of Cards, and The Crown became mega-hits for Netflix, many people credited the analytics capabilities of the company. Mining its customer data had enabled the firm to project the type of original programming that would be highly successful. By this logic, Netflix should achieve a lower failure rate on new shows than the major television networks. After all, broadcasters such as CBS and NBC cancel a substantial share of their new shows each year, some after only a few episodes.
On the recent Netflix earnings call, many investors were pleased to hear about strong subscriber growth at the firm. However, some investors came away concerned about the amount of spending taking place as the firm acquires or develops new content. Moreover, some observers and analysts have expressed concern about the recent cancellations of some new Netflix original shows. Tom Huddleston Jr. reported on the company's reaction to this criticism in a recent Fortune article:
Meanwhile, also on the Monday earnings call, Netflix's chief content officer Ted Sarandos defended the company's recent cancellations of a handful of expensive, but underperforming, original series. "The more shows we have, the more likely in absolute numbers that you'll see cancellations, of course," Sarandos said. The executive compared Netflix's recent spate of cancellations, including big-budget series like The Get Down and Sense8, to traditional TV networks that cancel nearly one-third of their new shows after their first seasons. Netflix, he said, has renewed 93% of its original series. With respect to the shows that Netflix opted not to renew, Sarandos argued: "If you're not failing, maybe you're not trying hard enough."
This quote from Sarandos raises a fascinating question. What is the "optimal" failure rate at Netflix? Surely, we would like the failure rate to be lower than the broadcast networks. We would like to see the company reaping the benefits of its analytics capabilities. At the same time, no one should want Netflix's failure rate on original programming to be zero. We want the firm to take some chances in hopes of landing some surprising breakthrough hits. Hopefully, the firm isn't simply guessing or drawing on the intuition of the "creatives" in the business. We would like to see them engaging in "enlightened" experimentation, using big data to guide them while still taking some risks. If they balance data mining and risk-taking in an effective way, the failure rate won't be zero, but it will be much lower than their broadcast and cable competitors.
"AI is akin to building a rocket ship. You need a huge engine and a lot of fuel. The rocket engine is the learning algorithms but the fuel is the huge amounts of data we can feed to these algorithms." - Andrew Ng
Brian here again ---
We've released our Aerospike Hadoop integration, which allows you to easily use Aerospike as your Hadoop datastore. It's location-aware, so you can run MapReduce without network traffic, or emit results to Aerospike in real time so your applications can use the insights _immediately_.
Two days after launching the beta version of their first product, Soluto was anointed the winner of the hyper-competitive start-up battle at TechCrunch Disrupt yesterday in New York. This is a prestigious accolade and an ideal launch pad, especially for a company like Soluto which needs a large user base to perfect its product. To be clear, this was not just any start-up competition, or simply the latest crop of Web start-ups, but Web start-ups that have the potential to be disruptive. I know it's a loaded word that many will contest until they are blue in the face, but simply adding this modifier made this more interesting and challenging than usual. I applaud TechCrunch for putting on the event, and am thrilled that my own portfolio company would take home the victory cup. So congratulations to Tomer, Ishay, Roee and the rest of the hard-working Soluto team!
And for those of you who don't already know, Soluto is developing anti-frustration software. This download and accompanying service aims to lessen, if not eliminate, the frustration PC users feel when they twiddle their thumbs waiting for their computer to boot, stare at a frozen mouse cursor or rotating hourglass, or scream in anguish when an application suddenly crashes on them.
I invested in Soluto foremost because of the strong entrepreneurs, who exhibit that rare combination of technology depth and aptitude for consumer products. However, beyond the team and market potential, Soluto had a particular resonance with me because of their vision and approach. It fit squarely with my own investment roadmap around companies that leverage technology and their user base to create innovative web-based services for consumers.
My favorite motif within this "technology-enabled, crowd source-enhanced web service" investment roadmap of mine is that of Big Data-based services. Big Data simply refers to incredibly large data sets that are too cumbersome to accumulate, let alone work with and make sense of. I am not so much interested in the companies developing infrastructure solutions to manage data, but rather in companies that are developing new products and services based on their ability to capture big data, synthesize and analyze it, and package it into simple, yet valuable, consumer products and services.
Initially, the appeal lies with the fact that very often the data already exists, but is buried or otherwise inaccessible. Secondly, I am attracted by the idea that the product will strengthen with more use and over time, creating a naturally widening lead over any aspiring competition (large or small). I am increasingly of the opinion that to be successful, particularly out of Israel, web start-ups must either leverage strong technology and/or the power of the crowd to maintain a competitive advantage in the face of so much competition for customers and investment dollars.
All of this is far from trivial, but Soluto aims to do just this. They start with a powerful, yet very intuitive download, which serves the dual purpose of providing a free boot utility to consumers while capturing important data anonymously, not unlike anti-virus software. This "passive" crowd sourcing is valuable because Soluto has already built the backend of their service, which knows how to make sense of the data for the creation of the second-order product: the anti-frustration service. There is also "active" crowd sourcing through the techie users, who can easily contribute their knowledge and solutions to the product.
With more than a billion PCs in use, most of them frustrating their users, the business opportunity is enormous. The intense demand explains the relative success of snake-oil solutions like registry cleaners, or extreme methods like repetitive reimaging. And as anti-virus increasingly becomes a commodity or outright free, anti-frustration software, pioneered by Soluto, may be its natural successor.
Even though it is still early days at Soluto, I continue to look for more companies that pursue similar strategies. In fact, I hope to announce my next "technology-enabled, crowd source-enhanced" web service investment soon. In the meantime, download and install Soluto!
For all its beauty and elegance, the iPhone's UI, in the state demoed on Apple's website and at Macworld, has at least two fundamental issues, even disregarding the whole touch-screen/haptics debate.
These two issues are scalability and contextuality -- a lack of both. I'll address the first issue in this entry, and the second issue in a later entry.
There are two areas where the iPhone UI will fail to scale.
1. One-touch home page can't scale
The iPhone relies on being a feature phone (not a smartphone, see my previous entry) to implement Steve Jobs's vaunted two clicks from anywhere UI functionality. If you add extra apps, for example a pedometer, a finances app, a possessions (eg. books, CDs, etc.) database, an ebook reader, a word processor, a spreadsheet, a presentation app, a dictionary, etc. etc. you will quickly run out of screen real estate.
When that happens, you have two choices:
Add a scroll bar, which makes some items three clicks away (tap the home button, tap the scroll bar or "flick scroll", tap the icon) or more
Add folders, which makes items three taps away (tap home, tap the folder, tap the icon)
Basically, there's nothing magic about Apple's "two-clicks to anywhere". It's just a result of crippleware.
2. Flick-scrolling without context reduction only works for small datasets
This one is a bigger problem for Apple. Pay careful attention to the demo of the Contacts application on the iPhone (available at http://www.apple.com/iphone/phone/). Notice that the app has no search icon or text pane. All it has is a list of contacts and the alphabet down the side.
The demo shows how cool flick-scrolling looks. What it doesn't show is how painful it would be searching through a database of 400+ contacts (which is not a big database for many users, now that people sync with their PCs). Flick-scrolling is inherently imprecise, and thus a slow way to find a single item in a large dataset (which is mostly what you want to do with a contacts database on a phone).
What's the fastest way to find a contact? Well, iContacts on the Mac actually has it built in: a filter that narrows the contacts list. It's called Spotlight, and is available on virtually every window on the Mac. However, it is conspicuously absent from the iPhone.
(The reason it's absent from the iPhone is pretty easy to guess: Spotlight isn't much chop without a keyboard to enter text, but the iPhone doesn't have any ugly plastic buttons, so if you want to enter text on it, your usable screen space suddenly vanishes away, eaten up by an ugly onscreen keyboard -- have a look at the SMS demo at the iPhone site. So filtering a large list by entering text is not something that the iPhone's form-factor is very good at.)
Unfortunately, flick-scrolling really isn't a substitute for Spotlight-style filtering for two reasons:
Flick-scrolling is imprecise. I've already mentioned that this imprecision makes navigating to a single contact a pain. It's hard to describe, and, until I've played with an iPhone, I can't be sure just how painful flick-scrolling will be, but I'm pretty sure it'll be painful. Even if flick-scrolling is magically wonderful, there's still another reason why it's vastly inferior to filtering a long list. It is telling that the contacts list in Apple's demo is pretty small.
Long lists are hard to visually search. The item you're looking for just gets lost in the midst of the huge number of items you're looking through. Humans are very good at pattern matching, but even humans get overwhelmed if there are simply too many candidates to match against, and scrolling doesn't reduce the candidate pool.
This is where filtering really makes its money: it reduces the context to the minimal, useful context. If I'm looking for my contact in my database, all the other Lithgows overwhelm it. Even if I can tap on "L", I'm still faced with a lot of distractingly similar near-matches. But if I can filter for "Malcolm", then I can remove all of them with seven touches (in fact, I can remove pretty much all of them with three or four touches: "mal" or "malc"). Then I don't have to scour the list, I simply choose the only option.
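The filtering advantage is simple enough to sketch in a few lines of code (the names and list here are made up for illustration): each character typed shrinks the candidate pool, so three or four keystrokes can collapse a 400-entry list down to one or two matches that the eye can scan instantly.

```python
# Minimal sketch of incremental filtering: every keystroke narrows the
# candidate pool, unlike scrolling, which never reduces it.
contacts = ["Malcolm Lithgow", "Mary Lithgow", "Matthew Lithgow",
            "Luke Lithgow", "Mal Jones"]  # imagine 400+ entries here

def filter_contacts(contacts, typed):
    """Return only the contacts containing the typed text (case-insensitive)."""
    typed = typed.lower()
    return [c for c in contacts if typed in c.lower()]

# Each additional character leaves fewer candidates to scan visually.
for typed in ["m", "ma", "mal", "malc"]:
    matches = filter_contacts(contacts, typed)
    print(f"'{typed}' leaves {len(matches)} candidates: {matches}")
```

The point of the sketch is the shrinking list, not the matching rule; a real implementation would match on word prefixes and rank results, but the ergonomic argument is the same.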
The inherent lack of this capability in the iPhone's UI will make for a frustrating experience for people who have any significant amount of data. The iPhone thus limits itself to toy status (much as the Newton did up until its swansong with the MP 2000).
Can Apple fix this? Yes, they can, but fixing involves moving back towards standard PDA interfaces, either providing a physical keyboard (unlikely), or providing some form of touch-input for letters (there are many innovative solutions out there, check out Ring-Writer, for example).
But there are other problems with the iPhone's UI that indicate that Apple has been thinking more about glamour than substance. The major one is the lack of contextuality, and I'll be talking about that next. Stay tuned.
Supercomputer-maker Cray is helping oil and gas companies benefit from the most-advanced reservoir modeling approach yet. Called Permanent Reservoir Monitoring, or PRM, the technique requires innovative data warehousing technology and data analysis techniques.
We are looking for a Front-end Engineer for one of our clients, a disruptive big data software startup based in the city center of Madrid. You would be part of an innovative team. Responsibilities: Your main responsibility will be to take their system to the next level in terms of design, UX and architecture. Requirements: A minimum of 3 years of relevant experience building front ends. Technologies: JS, CSS, Sass, FE frameworks (at least one of the following: Angular, Ember, Vue, React),...
We are looking for a DevOps Specialist for one of our clients, a disruptive big data software startup based in the city center of Madrid. You would be part of a team in charge of supporting the company infrastructure and its associated systems. Requirements: A Bachelor's degree in Computer Science or similar. A passion for Unix/Linux (some experience with Ubuntu/Debian and macOS required). Experience with cloud. Experience managing continuous delivery environments. Our client...
We are proud to present IBSurgeon FirstAID 5.0 – the new version of the recovery software with the highest rate of successful repairs. FirstAID 5.0 is a major improvement: it now supports Firebird 3.0, InterBase XE7, and big databases (100 GB+). Download IBSurgeon FirstAID 5.0. If you are a user of FirstAID version 3.x or 4.x, you can log into IBSurgeon
The big data world is a confusing place. We’re no longer in a market dominated mostly by relational databases, and the alternatives have multiplied in a baby boom of diversity. These child prodigies of the data scene show great promise …
Here are some of the key big data themes I expect to dominate 2013, and of course will be covering in Strata. Emergence of a big data architecture The coming year will mark the graduation for many big data pilot …
Where does all the data in “big data” come from? And why isn’t big data just a concern for companies such as Facebook and Google? The answer is that the web companies are the forerunners. Driven by social, mobile, and …
Visual Analytics Workshop at BlackHat Las Vegas 2017. Sign up today! Once again, at BlackHat Las Vegas, I will be teaching the Visual Analytics for Security Workshop. This is the 5th year in a row that I'll be teaching this class at BlackHat US. Overall, it's the 29th time that I'll be teaching this workshop. Every […]
Big cities continue to be centers for innovative solutions and services. Governments are quickly identifying opportunities to take advantage of this energy and revolutionize the means by which they deliver services to the public. The governmental public health sector is rapidly evolving in this respect, and Chicago is an emerging example of some of the changes to come. Governments are gradually adopting innovative informatics and big data tools and strategies, led by pioneering jurisdictions that are piecing together the standards, policy frameworks, and leadership structures fundamental to effective analytics use. They give an enticing glimpse of the technology's potential and a sense of the challenges that stand in the way. This is a rapidly evolving environment, and cities can work with partners to capitalize on the innovative energies of civic tech communities, health care systems, and emerging markets to introduce new methods to solve old problems.
It's no secret that cybersecurity is the single largest challenge facing CIOs and tech leaders in 2017. With mass digitalization, exponential increases in big data volume, the growing popularity of […]
Glenn's Bio: By day Glenn Block works at Splunk making it easier for developers to work with Big Data as he drives the development of Splunk's Dev platform. By night, Glenn is an active maintainer and contributor of several OSS projects including scriptcs (https://github.com/scriptcs/scri…). He is a polyglot with his most recent favorite language being...
Unusual new, global big color database to aid color decision-making including competitive intelligence across industries, on-line trend "listening", color trademarks by industry and country, and global color research studies
Checking the Health of the Economy Modern society lives with Big Data and statistics, but every statistic has a story behind it. Though the United States economy is improving, there are some numbers or
I may be a couple of days late for Valentine's Day, but there is a serious love fest between Big Data and the Internet of Things. What is Big Data? Wikipedia says: Big data is a term applied to data sets whose size is beyond the ability of commonly used software tools to capture, […]
The transport industry and urban logistics benefit from telematics. But will they still, the day after tomorrow? New approaches point an exciting way into the future, for example towards the Physical Internet. Photo: iStock/Alija
Telematics? In times of digitalisation, Logistics 4.0, big data, e-commerce and real-time notification, the term seems outdated. … Read more
The amount of data being generated is increasing by orders of magnitude year-over-year. Traditionally, this hasn't affected organizations much as they have the data that they are required to keep, the primary data that ties ...
Babbel, the online learning system for foreign languages, today announced the closing of a series B funding round. Leading the round is Reed Elsevier Ventures. Other investors include Nokia Growth Partners as well as existing investors, IBB Beteiligungsgesellschaft via its VC Fonds Technologie Berlin, and Kizoo Technology Ventures. The investment will be used to accelerate international expansion and improve the adaptation to all relevant mobile and online platforms.
Present in more than 190 countries, the Berlin-based startup's strongest footprint to date has been in the German market. Now it will aggressively enter other European countries, the Americas and emerging markets. Babbel will also extend its partnerships with different hardware manufacturers, platform providers and media across the world.
Babbel.com is operated by Lesson Nine GmbH, Berlin. The company had previously raised a total of $2.2M in equity and debt and has experienced rapid revenue growth of over 200% per year since 2011. Recently, Lesson Nine announced the [acquisition of San Francisco-based competitor PlaySay](http://press.babbel.com/en/releases/2013-03-21-education-startup-babbel.com-acquires-san-francisco-based-playsay.html). Unrelated to the new investment, the deal was made with operating cash flow.
The basis for the language learning system's success is its consistent use of mobile and Internet technologies and the integration of modern, practical learning content that motivates and guides the learner in an entertaining way. Over 6500 tailor-made learning hours for thirteen languages are available to learners online, as an iPad app and as free vocabulary training apps for iPhone, Android and Windows 8. As of today, the apps have been downloaded over 8 million times.
"Babbel is a European digital media success story and I am delighted that we are joining the investor group at this exciting time," says Tony Askew, General Partner at Reed Elsevier Ventures. "The startup has grown rapidly to over 15 million users and has built a large subscriber base which generates positive cash flow. Babbel's excellent mobile and online products consistently rate as consumer favorites and Babbel is very well-positioned for explosive growth in the rapidly growing category of mobile and online language learning."
"Nokia Growth Partners believes that in a converged digital world, every business must be mobile and this principle drives our investments," adds Walter Masalin, Principal at Nokia Growth Partners. "As mobile transforms the way people learn, Babbel's flexible and efficient solution supporting multiple platforms means it is well positioned to capitalize on this trend."
"Since our investment in 2008, Lesson Nine has successfully occupied various markets with its innovative products, and has established itself worldwide as a serious player in the realm of mobile language learning," says Marco Zeller, Managing Director of IBB Beteiligungsgesellschaft mbH. "This funding round, including other international investors, honors the Berlin company's extremely positive development, and creates a foundation for even more dynamic growth. We are proud to have been on board with this success story since the beginning, and also to provide more capital as part of this round."
Michael Greve, CEO of Kizoo Technology Ventures, says: "Since we started working with Babbel five years ago, the product has made an exciting journey from a nice web tool to a modern and fun language-learning experience with a huge user base eager to subscribe to the service. I believe the ideal platform for language learning is tablets, and we can expect even more accelerated growth of the beautiful mobile Babbel products in the future."
"We are happy to have two new high-profile international investors on board. This financing round opens a great number of opportunities without limiting our strategic options. The renewed participation of existing investors IBB Beteiligungsgesellschaft and Kizoo also pleases me. For our great team of seventy people, there's still much to be done and much to achieve," says Markus Witte, CEO of Lesson Nine GmbH.
Babbel is the new way to learn languages. With the online language learning system, both beginners and continuing learners can study French, Spanish, Italian, Brazilian Portuguese, Swedish, German, Dutch, Indonesian, Polish, Turkish, Norwegian, Danish and English with the help of interactive listening, writing and speaking exercises. The website babbel.com offers numerous online courses. In addition there are apps for iPad, iPhone, iPod, Android and Windows 8 devices, as well as interactive eBooks. More than 15 million people from over 190 countries are already learning a language with Babbel.
Babbel is operated by Lesson Nine GmbH, Berlin. The company was founded in August, 2007, and now has around 170 employees and freelancers. Since March, 2013, Lesson Nine has been involved with Reed Elsevier Ventures, Nokia Growth Partners, Kizoo Technology Ventures and IBB Beteiligungsgesellschaft via its VC Fonds Technologie Berlin. Further information at: http://www.babbel.com
About Reed Elsevier Ventures:
Founded in 2000, Reed Elsevier Ventures is a venture capital firm based in London and San Francisco and backed by one of the worldâs most successful media and information companies, Reed Elsevier. Reed Elsevier Ventures invests in talented and ambitious entrepreneurs and management teams who have the drive to build large, scalable businesses and the determination to become industry leaders. Reed Elsevier Ventures focus on high growth, internet, media and technology businesses based in the US, Europe or Israel in sectors such as big data and analytics, mobile, new media, healthcare information and groundbreaking analytic technologies. Example portfolio companies include Palantir, one of Silicon Valleyâs most valuable technology companies, and Babylon, the worldâs most downloaded language translation tool.
About Nokia Growth Partners:
Nokia Growth Partners invests in companies that are changing the face of mobility, communications and the internet. NGP offers industry expertise, capital and an extensive network, enabling entrepreneurs to build disruptive, industry-changing companies and take them to the global market. With offices in the US, Europe, India and China, NGP extends the reach of companies making their products and services local everywhere. Visit http://www.nokiagrowthpartners.com/ for more information.
About IBB Beteiligungsgesellschaft mbH:
The IBB Beteiligungsgesellschaft provides venture capital to innovative Berlin enterprises and has established itself as a market leader in the field of early-stage financing in Berlin. The funds are used primarily for the development and market launch of innovative products or services, as well as for business concepts of the creative industries. Currently two of the funds managed by the IBB Beteiligungsgesellschaft are in the investment phase: the VC Fonds Technologie Berlin with a fund size of €52 million and the VC Fonds Kreativwirtschaft Berlin with a fund size of €30 million. Both VC funds are financed by means of the Investitionsbank Berlin (IBB) and the European Fund for Regional Development (EFRE) administered by the State of Berlin. Since 1997 the IBB Beteiligungsgesellschaft Berlin, in consortia with partners, has made €850 million available to creative and technology-orientated companies; thereof, the portion invested by IBB Beteiligungsgesellschaft itself, as lead, co-lead or co-investor, was approximately €116 million.
About Kizoo Technology Ventures:
Kizoo helps young start-up teams grow. As a seed and early stage investor with a focus on SaaS, Internet & Mobile Services and Social Applications, Kizoo is happy to share its own long-time experience in development, marketing and product management in those markets. www.kizoo.de
We are looking for a Big Data Engineer. Responsibilities: Ensure that the designed platform meets the organization's expectations and requirements for assembling in-house and third-party analytical models. Participate in the sizing, design and development of the architectures, based on the data and needs analyzed, built on distributions such as AWS, Cloudera, Hortonworks, MapR, ... Predictive analysis and data mining: analytical models and engines for...
Data Scientist: how to turn data into value. The art of exploiting the big data asset as a source for experimenting with and innovating the business models of brands, and for fostering the creation of added value. Analysis and reflections on the emerging figure of the Data Scientist. An article rich in information and curiosities, prepared with practical and relevant support […]
When I was a graduate student back in the dark ages, I took an advanced statistics course and then briefly worked in a laboratory where statistical analysis of data derived from animal models of disea...
Yep, excellent set of topics. We are moving past the 3Vs of Big Data to the Three Amigos of Big Data - Interface, Intelligence & Inference. http://goo.gl/NqOWQ
A couple of observations: Wearable Devices is missing from the list. Deep Learning definitely is an emerging topic, something close to my heart. I think an Analytic Sandbox supported by a Data Landing Zone is more accurate than convergence of databases. Lastly, the larger picture is the Big Data pipeline spanning Data Management & Data Science, viz. Collect-Store-Transform-Model-Reason-Visualize/Predict/Recommend-Explore.
Patrick Schwerdtfeger is available to speak in Bangalore as a leading authority on technology (including ‘big data’ and predictive analytics) and global business trends (including demographics, particularly in south Asia). He’s the author of the award-winning book Marketing Shortcuts for the Self-Employed (2011, Wiley) and a regular speaker for Bloomberg TV. If you are […]
Automation processes and multi-faceted data analysis based on Big Data allow companies to control processes related both to the flow of goods and to cash transactions. Still, we are far from full automation. That is why it is worth remembering the needs of employees who, without proper inclusion in modern processes, may be left on the margins of the industrial revolution - Food Industry. Business 4.0 - new technologies as a competitiveness factor. Part II
As an analyst, I often see new Big Things coming by. Things like cloud, big data, mobility, even going back to client-server. The reaction of established vendors to these Big Things is pretty predictable, even depressingly so. Here is what it looks like, where X = whatever is new. X will never work No one […]
Julie Johnson, a business associate of mine, posted this piece on her blog about the 'What else?' question in a coaching conversation:
Imagine that you are coaching someone, and you both agree that it is time to focus on generating possible solutions to the challenge at hand. So you ask your coachee: "How can you achieve this goal?" Without any hesitation, you receive an answer. What do you do next?
Let's take a case in point. A few years ago I was coaching someone who wanted to get better at giving strategic presentations, especially to senior management. We had already explored what had gone well and less well in the past, conditions that have an impact on performance, the advantages to achieving the goal and the disadvantages of not achieving it. By this point, his motivation was solidly in place. With both of us keen to get to solutions, the conversation went something like this:
I asked, "What can you do to improve your presentation skills when presenting to senior management?" My coachee quickly replied, "I can take a course."
Tempted to explore possible courses, and whether there was a budget available for a course, etc., I simply made note and asked, "What else?" He quickly replied, "I can get a presentation coach."
I thought about exploring the qualities of the ideal presentation coach, but didn't. Instead I inquired, "And what else can you do?" There was a slight pause, and then he answered, "Well, I could go on YouTube and check out the techniques of some of my favorite speakers. [pause] And TED Talks. Mmm. I'd like that."
I noted once more, and then said, "What other things might you do?" There was a significant pause, during which he looked out the window. Then he said, "David. He is quite good. I'd love to have coffee with him and pick his brain. [pause] And I really need to watch him more consciously when he presents next time, and figure out what it is he is doing exactly that works so well."
"Mmmm," I said, noting these new ideas. "And what else would work for you?" This pause was even longer, and I waited. Finally he said, "Well, a couple of my team members have attended some senior management meetings, and they've seen me in action. I bet they would be happy to give me candid feedback and suggestions."
Tempted to ask who he might speak with and what questions he might ask, I just said, "OK. Anything else?" After a very, very long silence, he said, "Well, frankly, if I am really serious about this, I should practice my next presentation several times before I actually have to give it. [pause] I could even film myself. Yes! Yes! It would be so useful to observe myself in action! Then, when I finally like what I see, I will have the confidence to do a repeat performance when it really matters!"
When he was out of ideas, we reviewed each option he had generated, and then moved eagerly on to action planning.
While some of those post-question silences were pretty long, I don't even think that my coachee noticed them. He was very busy creating. His first ideas were probably not new, because his answers came immediately after the question was posed. But because I kept asking the same question (with different words) over and over again, his mind kept creating, and the pauses between question and answer got longer and longer.
My general guideline in these situations is "The longer the silence, the newer the idea." There are two things to avoid once you have carefully crafted this creative moment:
Don't grab one idea and analyze it in detail; leave that for later once all the ideas are on the table.
Keep in mind that the longer the silence after your question, the harder your coachee is probably thinking, and therefore creating. If your question is followed by silence, you are probably "on a roll"! This is the best confirmation that your question is a good one!
I like the narrative style. It's practical and gives a good insight into the judgements that a coach continually has to make about the nature of their interventions. I shared the blog with some managers that I had been working with on coaching practices the day after the blog was posted and I received this response from one of the recipients shortly afterwards:
Thanks John. I actually used this approach with one of my team â it worked brilliantly, and almost as set out above.
...nice feedback and evidence of the importance of sharing ideas and practices.
Coaching practice - what else do we need to consider?
Through my masters studies I looked in some detail at the field of conversation analysis and ethnomethodology and the structuring of sense-making that is part of everyday conversational interaction. If you are interested in following this in more detail please go to my blog Observing Practice.
Of particular interest in Julie's account is the description of the silences. What's noticeable is the work that's going on in the silences.
The 'what else?' question is an effective device to stimulate thought, and the skill of the coach is to hold the pause to allow the thinking work to develop - "There was a significant pause, during which he looked out the window". However, what we can't tell from this is what the coach was doing whilst the coachee was looking out of the window. My expectation is that the coach was helping to maintain the silence by following good listening practices like maintaining eye contact and avoiding non-verbal gestures or movements that might disturb the silence. My point is that these practices are being taken for granted, but they are as much a part of the ordering process to accomplish an effective coaching intervention as the power of the question itself.
Why is this important?
Coaching is an important approach to helping facilitate change in individuals and teams. However, like other management practices, it has a kind of mysterious 'black box' quality to it; coaching is not accomplished through a model or a set of questions or behaviours but through a choreography of fine-grained actions that emerge situationally each and every time a coach works with an individual or a team. In other words, however good 'what else?' is as a question - and it's one that I often use too when I'm coaching - it rather glosses over a lot of important but unseen work that is also contributing to the result.
The challenge of capturing everyday action
My continuing interest in conversation analysis is the opportunity that it offers to study the choreography and use it as a learning tool to enhance the development of coaches and coaching practice.
The challenge is that this choreography slips by too quickly and is too nuanced; and to be of use the action would need to be captured on video or audio and then analysed to produce a micro level detail of practice. I know from personal experience that the analysis takes a great deal of time and, at present, is too onerous to be of practical use.
However, I am hopeful. Big data technologies are now emerging that are able to provide information about, inter alia, workplace interactions - for an example see the HBR blog The new science of building great teams and the use of electronic badges to gather interactional data. This type of analysis looks very promising and is producing some ground-breaking insight into how people interact. It forms part of some work coming out of MIT that is being described as social physics. I'm reading up on this at the moment and will write a further post of my sense-making on this topic.
In my previous post, I mentioned that Oracle Big Data Cloud Service – Compute Edition started to come with Zeppelin 0.7 and the version 0.7 does not have HIVE interpreter. It means we won’t be able to use “%hive” blocks to run queries for Apache Hive. Instead of “%hive” blocks, we can use JDBC interpreter […]
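As a concrete illustration of the workaround described above, a Hive query in Zeppelin 0.7 might look like the following notebook paragraph. This is only a sketch: the `hive` prefix assumes a JDBC interpreter alias pointing at the cluster's HiveServer2 has been configured in the interpreter settings, and `sample_logs` is a hypothetical table used purely for illustration.

```sql
%jdbc(hive)
-- Runs through Zeppelin's JDBC interpreter instead of the removed %hive interpreter.
-- "hive" is the JDBC interpreter prefix configured in Zeppelin's interpreter settings;
-- sample_logs is a hypothetical Hive table used only for illustration.
SELECT status, COUNT(*) AS hits
FROM sample_logs
GROUP BY status
ORDER BY hits DESC
LIMIT 10
```

If no `hive` alias is configured, the same query can be run through a plain `%jdbc` paragraph once the interpreter's default URL points at HiveServer2.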
Last week, Oracle Big Data Cloud Service – Compute Edition was upgraded from 17.2.5 to 17.3.1-20. I do not know if the new version is still in testing phase and available to only trial users, but sooner or later the new version will be available to all Oracle Cloud users. The new version is still […]
Why do so many companies make bad decisions, even with access to unprecedented amounts of data? With stories from Nokia to Netflix to the oracles of ancient Greece, Tricia Wang demystifies big data and identifies its pitfalls, suggesting that we focus instead on "thick data" -- precious, unquantifiable insights from actual people -- to make the right business decisions and thrive in the unknown.
Barcelona, Spain & Bengaluru, India, August 27th 2015
Openbravo, the provider of the preferred Commerce Suite and Business Suite, today, announced a strategic partnership with Happiest Minds Technologies, a next generation Digital Transformation, Infrastructure & Security and Product Engineering Services Company.
"Openbravo welcomes the strategic partnership with Happiest Minds and both organizations are engaging in a great working relationship to bring cutting-edge technological advancements to Openbravo's retail offerings. Our global presence plus the importance and international reach of the Openbravo Commerce Suite clients in the retail space makes us uniquely positioned to work together with a partner with the stature of Happiest Minds. We believe that Retail Customers, several of which are true market leaders, can benefit greatly from this alliance", said Marco de Vries, CEO, Openbravo.
"We were looking for the right partner to build a unique proposition for retailers to help them rapidly improve their customer buying experience within an OmniChannel framework. With Openbravo, we found the right solution at a competitive price and flexibility to extend it to provide a differentiated offering to our Retail Customers ", said Salil Godika, Chief Strategy & Marketing Officer and Retail Industry Group Head of Happiest Minds.
"Happiest Minds and Openbravo are focused on the OmniChannel framework where the idea is to enhance the overall performance of a retailer and help its Retailer Customers bring the consumer back into their stores by providing a unique experience. We are working together in areas such as OmniChannel Fulfillment, Retail IoT and Advanced Analytics" - Said Sunando Banerjee, Channel Business Manager at Openbravo.
The Openbravo Commerce Suite is a multichannel retail business solution built on top of a truly modular, mobile-enabled and cloud-ready technology platform. The platform allows retailers to transform their physical store channel and do more and faster, with lower risks.
Openbravo is a world leader in the commercial open source software space, helping midsize to large organizations in 60+ countries around the globe successfully manage continuous change and innovation. It provides business management solutions that deliver a high degree of agility, responsiveness and usability: a state-of-the-art multichannel retail solution, the Openbravo Commerce Suite, and a global management solution, the Openbravo Business Suite. Both are built on top of a highly flexible and extendible platform that allows companies a greater focus on differentiation and innovation.
Openbravo solutions are exclusively distributed through a network of Official Openbravo Partners. Openbravo has offices in India, France, Mexico and Spain.
About Happiest Minds Technologies
Happiest Minds enables Digital Transformation for enterprises and technology providers by delivering seamless customer experience, business efficiency and actionable insights through an integrated set of disruptive technologies: Big Data Analytics, Internet of Things, Mobility, Cloud, Security, Unified Communications, etc. Happiest Minds offers domain-centric solutions applying skills, IPs and functional expertise in IT Services, Product Engineering, Infrastructure Management and Security. These services have applicability across industry sectors such as retail, consumer packaged goods, e-commerce, banking, insurance, hi-tech, engineering R&D, manufacturing, automotive and travel/transportation/hospitality. Headquartered in Bangalore, India, Happiest Minds has operations in the US, UK, Singapore and Australia, and has secured $52.5 million in Series-A funding. Its investors are JPMorgan Private Equity Group, Intel Capital and Ashok Soota.
NIH Director's Blog, 10 November 2015. In recent years, there's been a lot of talk about how "Big Data" stands to revolutionize biomedical research. Indeed, we've already gained many new insights into health and disease thanks to the power of new technologies to generate astonishing amounts of molecular data: DNA sequences, epigenetic marks, and metabolic signatures, to name […]
In the digital world, where billions of customers are making trillions of visits in a multi-channel marketing environment, big data has drawn researchers' attention all over the world. Customers leave behind a huge trail of data volumes in digital channels. It is becoming an extremely difficult task finding the right [...]
Big Data: As Google celebrates its 10th anniversary, we find out how science is coping with massive datasets generated by unprecedented computing power. BoingBoing blogger Cory Doctorow tells us about his visits to the LHC data storage facility and the genome sequencing Sanger Centre.
Gary Marcus of New York University talks with EconTalk host Russ Roberts about the future of artificial intelligence (AI). While Marcus is concerned about how advances in AI might hurt human flourishing, he argues that truly transformative smart machines are still a long way away and that to date, the exponential improvements in technology have been in hardware, not software. Marcus proposes ways to raise standards in programming to reduce mistakes that would have catastrophic effects if advanced AI does come to fruition. The two also discuss "big data's" emphasis on correlations, and how that leaves much to be desired.
William Kassler, MD, MPH Deputy Chief Health Officer Lead Population Health Officer IBM Watson Health TUESDAY, AUGUST 8, 2017 12:00-1:00 PM (LUNCH PROVIDED FOR FIRST 50 GUESTS) MAURER CENTER FOR PUBLIC HEALTH ROOM 3013 We are poised in a new era of technology and big data. This presentation will review sophisticated efforts that build on increased availability of big data and improved analytics, and will describe how these efforts support integration of personal, practice, system, and community programs to improve individual and population health. Objectives:... Continue Reading
I coauthored my 15th book. Together with Christian Cote (lead author) and Matija Lah (coauthor), we published the SQL Server 2017 Integration Services Cookbook. Of course, it is a bit early to call this a definitive guide to SSIS 2017; a more accurate name would be the SSIS 2016/2017 Cookbook. Besides detailed guidelines on how to use the 2016 version, you will also find a chapter with some new information on scaling out SSIS 2017. In the future, we will add an online chapter, if needed, about additional new SSIS 2017 functionality. Anyway, here is a brief description of the chapters.
Chapter 1: SSIS Setup
This chapter describes step by step how to set up SQL Server 2016 to get the features that are used in the book.
Chapter 2: What is New in SSIS 2016
This chapter is an overview of Integration Services 2016 new features. Some of the topics covered here are covered extensively later in the book.
Chapter 3: Key Components of a Modern ETL Solution
This chapter will explain how ETL has evolved over the past few years and will explain what components are necessary to get a modern scalable ETL solution that fits the modern data warehouse.
Chapter 4: Data Warehouse Loading Techniques
This chapter describes many patterns used when it comes to data warehouse (DW) or operational data store (ODS) load.
Chapter 5: Dealing with Data Quality
This chapter will describe how SSIS, DQS and MDS can be leveraged to validate, cleanse, maintain, and load data.
Chapter 6: SSIS Performance and Scalability
This chapter talks about how to monitor SSIS package execution. It also provides solutions to scale out processes by using parallelism. Readers learn how to identify bottlenecks and how to resolve them using various techniques.
Chapter 7: Unleash the Power of SSIS Script Task and Component
Readers learn how script tasks and script components are very valuable in many situations to overcome the limitations of stock toolbox tasks and transforms.
Chapter 8: SSIS and Advanced Analytics
This chapter talks about using SSIS to prepare data for and do advanced analyses like data mining, machine learning, and text mining. Readers learn how sampling components can be used for preparing the training and the test set, how to use SQL Server Analysis Services data mining models, how to execute R code inside SSIS, and how to analyze texts with SSIS.
Chapter 9: On-Premises and Azure Big Data Integration
This chapter talks about the Azure Feature pack that allows SSIS to integrate Azure data from blob storage and HDInsight clusters. Readers learn how to use Azure feature pack components to add flexibility to their SSIS solution architecture.
Chapter 10: Extending SSIS Custom Task and Transformations
This chapter talks about extending and customizing the toolbox using custom-developed tasks and transforms.
Chapter 11: Scale Out with SSIS 2017
The last chapter is dedicated to SSIS 2017 and teaches you how to scale out SSIS package executions on multiple servers.
We seek a biologist who has expertise in analysis of big data, modeling, bioinformatics, genomics/transcriptomics, biostatistics, or other quantitative and/or... From University of Richmond - Thu, 06 Jul 2017 23:17:18 GMT - View all Richmond, VA jobs
MCH's proprietary email scoring solution, eRespond, uses real-time data to rank and score K-12 educators in institutions most likely to respond to email offers. This gives marketing professionals the...
Of course many readers, technologists and businesses are acutely aware of the value prop of the cloud, but what about the rest of the "majority" of people who don't? If you do a search on this topic, you will mostly get links to marketing materials that are fairly high level and full of the usual buzzwords, with more links to lengthy case studies. As a business technology enabler, there needs to be a simpler message or elevator pitch that is easy to understand yet convincing. Ah, the simplicity or less-is-more approach rears its beautiful head once again. Below are some recipes to draw from for an elevator ride of any duration.
The Enterprise Cloud Reduces Business Outages, Increasing Customer Satisfaction
Put simply and avoiding technical acronyms, can the business rely on 99.99+% availability from the cloud? Look no further than Netflix as an example, because yes, the cloud is certainly up to the task.
Immediate Availability Gives You a Competitive Edge (for now)
Back in the day, Dell drove industry change by assembling computers on demand, reducing inventories and wreaking havoc on computer storefronts. Well, the cloud equivalent is to be able to go to the cloud store, answer a few questions, enter your credit info, and within an hour or less you are up and running.
A Right Sized Business Reduces Time, Headaches and Cost
You may wonder what this is. It takes many forms, but to me you have to ask: does the business get what it paid for? No more shelfware and complicated upgrade processes. You need more capacity for certain business cycles? Zoom, it's there. You need more storage? Of course that is automated. You think Apple and other retailers have to turn on any switches to increase capacity? Maybe they do now, but they shouldn't.
Business Friendly User Interfaces Decrease Time to Market
You're thinking mobile, and that is certainly true, but it is much more than that. This is the type of disruptive technology that could put technologists like me out of work. Non-technical end users must be able to build and deploy apps that previously required a CS degree and many years of experience. Consider how Smart Data cloud initiatives are disrupting Big Data. Nate Silver, beware.
Build a Nimble Business by Thinking Small
The cloud is all about modularity, extensibility and continuous release cycles. A/B testing drives micro feature planning, rendering the traditional roadmap virtually useless. In a fully optimized environment, features will show up before you even have a chance to request them. Social tools will drive this change more than traditional communication channels, and you have to be on top of it because your competition sure is.
Focus on Business but Engage Developers and IT to Modernize
Remember, we are all in this new delicate cloud ecosystem together, so it is best to engage all parties even as you try to do an end around them. Developers will be crucial allies in building a hybrid on-premises cloud solution, and IT will help you track outages on cloud systems just like they always have for internal systems.
Be Business Tech Savvy to Extend Your Brand
You may be asking yourself, how does this extend the brand? Think huge retailer with leading cloud PaaS market share. Make a point of understanding these new disruptive forces of nature. Understand how NodeJS and NPM drive modularity and manage dependencies that reduce business risk. Utilize GitHub to research and rate technologies that could impact and shorten your time to market. Don't be passive; get a free developer account on Koding.com to experience a PaaS system and write a Hello app in 5 different languages. Use the force, do a lot of "What is XYZ" searches.
Transform your business, but don't forget to have some fun.
Surgical operations performed with the least possible impact on the body are changing medicine forever. Precision robots, big data, holographic previsualization and virtual reality are some of the key technological paradigms for this type of intervention, which drastically reduces risks, pain and recovery times for patients.
As we have seen in recent news headlines, security breaches can bring entire organizations, states and countries to their knees. In today's connected world, making security a top priority is no longer a choice - it's a must. As public and private organizations continue to operate within this new era of the Internet, security will become critical to maintaining trust with the public, building company reputation, as well as safeguarding data, IP and critical infrastructure.
California is at the center of the digital revolution that is shaping the world around us. Already a national center of commercial cybersecurity activities, California is home to companies building the cybersecurity products and solutions that are securing commercial businesses, academic institutions and governmental organizations across the globe.
In an effort to help advance the goals and promote the accomplishments of the Governor's Cybersecurity Task Force, CyberTECH, among other state and local leaders, recently launched CyberCalifornia.
CyberCalifornia will organize public-private partnerships in cybersecurity, with the goals of facilitating research and innovation in cybersecurity, educating California businesses about cybersecurity needs and resources, and connecting California's robust workforce development system with the needs of California employers.
Led by its Board of Advisors, CyberCalifornia activities include: assisting in the organization of private sector advisory groups by vertical industry, such as banking and finance, high technology, agriculture, etc.; assisting in the development and promotion of cybersecurity career pathways; partnering with local and regional economic development organizations to inform California's small business community about cybersecurity needs and solutions; and establishing connections between the cybersecurity and Internet of Things sectors through activities such as conferences and media events.
To learn more about CyberCalifornia, please contact email@example.com.
Darin Andersen, CEO, CyberUnited, Co-Chair, CyberTECH, Co-Chair, Economic Development Subcommittee, California Cybersecurity Task Force
"CyberCalifornia: Cybersecurity and IoT Gold Rush"
Recently, CyberTECH helped launch CyberCalifornia with other State and local leaders. The initiative is organized in conjunction with the Innovative Hub (iHub) Network, a program administered by the State Office of Economic Development and in partnership with Governor Brown's Cybersecurity Task Force.
Jerry Brown, Governor of California
CyberCalifornia will organize public-private partnerships in cybersecurity to better protect California's critical infrastructure, businesses and citizens from cyber threats, facilitate research and innovation in cybersecurity, educate California businesses about cybersecurity needs and resources, and connect California's robust workforce development system with the needs of California employers.
Center of Cybersecurity and Internet of Things Excellence (CCIoTE) California is home to the personal computer, the firewall, anti-virus and many other cybersecurity products. Today, California companies are at the forefront of new technologies ushering in the Internet of Things (IoT), the term for the phenomenon where people and things are connected to the Internet, leveraging sensors and real time analytics and cloud technologies.
California's leadership role in advanced technology sectors including autonomous vehicles, biotechnology, precision in medicine and advanced manufacturing, will contribute to the State's continued excellence in cybersecurity and privacy. The powerful combination of cyber and the emergence of these innovative intensive sectors make California the perfect place to build secure next generation technologies.
California has a rapidly growing information technology industry cluster and offers the full spectrum of cybersecurity capabilities. Our Golden State has tremendous assets to keep our Country safe, advance innovation with security and privacy built in and be a beacon for other States in our Nation to follow.
Charles "Chuck" Brooks, Vice President, Government Relations and Marketing, Sutherland Global Services
"Adopting a Cooperative Global Cyber Security Framework to Mitigate Cyber Threats (Before it is too Late)" The recent OPM cyber breach at the U.S. Government's Office of Personnel Management (OPM) provided a wakeup call to the seriousness and sophistication of the cyber security threat aimed at both the public and private sectors. The fact is that over 43% of companies had breaches last year (including mega companies such as Home Depot, JPMorgan, and Target). Moreover, the intrusion threats are not diminishing. For example, British Petroleum (BP) faces 50,000 attempts at cyber intrusion every day.
According to the think tank Center for Strategic and International Studies (CSIS), cyber related crime now costs the global economy about $445 billion every year. These cyber security breaches demonstrate that there is a continued need for protocols and enhanced collaboration between government and industry.
In 2014, code vulnerabilities such as Heartbleed, Shellshock, WireLurker and POODLE, along with flaws in other open source components, caused chaos and harm. The cyber security community responded to those vulnerabilities with "react and patch." Unfortunately, this means of response has been, for the most part, a cosmetic or band-aid approach.
The cyber security community's posture must change from one of wait-and-react to one of being proactive and holistic. It is not really a question of which policies, processes and technologies are ready and best; that will always be debatable. Being proactive means adopting a working Industry and Government Global Cyber Security Framework that would include measures for encryption, authentication, biometrics, analytics, automated network security, and a whole host of other topics related to cyber threats.
LIFARS is a digital forensics and cybersecurity intelligence firm based in New York City. With its history of investigating cybersecurity breaches across a number of industries, LIFARS is uniquely positioned to help increase cybersecurity posture to protect organizations and individuals from real-life hackers and advanced persistent threat actors. By bringing in LIFARS, you can maximize your existing investment into the cybersecurity infrastructure and make sure that your future investments are strategically placed, delivering maximum protection while preserving the productivity of your employees. For these and other reasons, LIFARS was recently ranked as the #2 cybersecurity company in the New York Metro area on the Cybersecurity 500 list.
In addition to providing robust security solutions based on best practices and personal hands-on experiences, LIFARS continuously explores the latest innovations in the cybersecurity field and always seeks to find what is shaping tomorrow's industry landscape. In a recent interview with Founder and CTO of LIFARS, Ondrej Krehel, and LIFARS' Digital Forensic Examiner, Paul Kubler, they discussed strategies and policies for cybersecurity in the world today, including common mistakes and how to make them right.
NXT Robotics is a San Diego-based company that designs and builds service robots to support the increasing needs of the hospitality industry. NXT Robotics' service robot platforms are able to provide delivery, security and guest-related services to customers - all while maintaining a consistent and high degree of quality.
The company's founder, Jeff Debrosse, has over 20 years of experience in software engineering, cybersecurity R&D, and enterprise product management and deployment. "This is an exciting time for NXT Robotics," said Jeff. "With access to the CyberTECH community and its resources, our success is further guaranteed."
The company will be providing CyberTECH's incubator and shared workspace offices, CyberHive and iHive, with its own Nixie. "Your team, tenants and guests will find Nixie to be amazingly pleasant to deal with - not to mention, very useful!" Jeff stated.
"NXT Robotics understands the importance of making cyber part of the foundation. We are thrilled to have NXT Robotics join CyberTECH as a member and look forward to working closely with Jeff and his team." said CyberTECH Co-Chair and Founder, Darin Andersen.
We are proud to recognize NXT Robotics as a featured CyberTECH Member for July 2015.
Bird Rock Systems, Featured CyberTECH Member Bird Rock Systems is a company that has been built on a foundation of exceptional customer service, technology and long-term partnership. Bird Rock Systems excels at deploying the latest enterprise class technologies including: security, routing, switching, traffic management, WAN acceleration, wireless, IP communications, storage area networking, performance computing and virtualization. Bird Rock typically begins a new client engagement by completing a network or security assessment. Their many loyal customers represent enterprise business, casino, university, Fortune 500 and government organizations requiring 'Best in Class' secure technical solutions.
iWebGate, Featured CyberTECH Member Founded in Australia in 2004 with global corporate operations in North America, iWebGate has pioneered a new form of virtualization technology - the Virtualization of Network Services. iWebGate's LaunchPad allows organizations to properly and securely segment networks, connectivity and devices, eliminating the need for Firewalls and VPNs as primary security and connectivity solutions. By deploying the iWebGate Workspace Suite, organizations can then integrate security and business applications into the iWebGate LaunchPad transforming them from "enterprise friendly" products into "enterprise ready" solutions. The result is faster, more secure and reliable access to networks and network services.
San Diego Venture Group Selects FHOOSH as a 2015 Cool Company Cybersecurity software development firm FHOOSH, Inc. has been chosen as a "Cool Company" by the San Diego Venture Group (SDVG) for a second year. One of 31 Cool Companies selected this year from over 160 applicants, FHOOSH continues to represent the leading edge of San Diego-area tech innovation.
FHOOSH helps corporations, institutions and government organizations protect and power valuable stored digital information with its cybersecurity platform and productivity software. FHOOSH bankLevel+ cybersecurity safeguards an organization's critical business and customer data from cyber threats by storing it in a state that is useless to hackers. It does this approximately five times faster than storing data unencrypted, with technology that breaks apart, disassociates, separately encrypts, and then disperses the data. The system also quickly notifies network administrators when unauthorized individuals try to access FHOOSH-protected databases, object stores and file systems. FHOOSH integrates with existing infrastructure and allows corporate partners to dial in the security, big data/analytics and performance they need. With 15 patents pending, FHOOSH technology has been validated by the foremost cybersecurity response and assessment firm.
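The general split-and-disperse idea described above can be illustrated with a toy XOR secret-sharing sketch in Python. This is a generic illustration of the technique, not FHOOSH's actual, patent-pending algorithm; the function names and share count are assumptions for demonstration. Each stored shard on its own is indistinguishable from random noise, and only recombining all shards recovers the data.

```python
import os

def split_and_disperse(data: bytes, n: int = 3) -> list[bytes]:
    """Split data into n XOR shares; all n shares are needed to
    reconstruct, and any single share alone is random noise."""
    shares = [os.urandom(len(data)) for _ in range(n - 1)]
    last = data
    for s in shares:
        # XOR the data with every random share to form the final share
        last = bytes(a ^ b for a, b in zip(last, s))
    shares.append(last)
    return shares

def reassemble(shares: list[bytes]) -> bytes:
    """XOR all shares together; the random shares cancel out."""
    out = bytes(len(shares[0]))  # all-zero buffer
    for s in shares:
        out = bytes(a ^ b for a, b in zip(out, s))
    return out

secret = b"customer record #4711"
parts = split_and_disperse(secret)
assert reassemble(parts) == secret
assert all(p != secret for p in parts)  # no single shard reveals the data
```

A real system would additionally encrypt each shard under its own key and store the shards on separate backends, so that a breach of any one store yields nothing usable.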
Maggey Felix specializes in Marketing and Operations with 5+ years of experience in the technology and cybersecurity industry. Her passion for cybersecurity and cutting-edge technologies is shown through her dedication to helping companies better prepare, organize and market their solutions.
Over the past two years, Maggey has worked closely with the CyberTECH organization to support various marketing and operational activities. Her ongoing effort and commitment to CyberTECH makes Maggey an invaluable member of the community.
We are proud to recognize Maggey Felix as the featured CyberTECH Advisor for July 2015.
Julia Scholl, CyberTECH Director of Marketing and Operations
Julia Scholl is a strategic and forward-thinking public relations and marketing professional with over five years of experience working with non-profit and startup organizations. A capable self-starter with excellent organizational and communication skills, Julia is passionate about building and fostering lasting relationships within the CyberTECH community.
As Marketing and Operations Director, Julia will assist with daily operations, provide membership support as well as ongoing support with events, programs, and all other CyberTECH initiatives.
We are excited to welcome Julia to CyberTECH. Please feel free to contact Julia directly at firstname.lastname@example.org.
Jessica Herrmann, CyberTECH Events Coordinator
Jessica Herrmann has over 20 years of experience applying key leadership, communication and problem solving skills within the hospitality industry. As Catering and Events Manager, Jessica has worked with a number of organizations to develop, manage and execute top quality events.
Jessica recently joined CyberTECH as Events Coordinator where she will help with the planning, organization, preparation and execution of CyberTECH events.
Please join us in formally welcoming Jessica to the CyberTECH community.
Check out their new brand video and you'll see why Webpass isn't just another ISP. They're leaders and innovators on a mission to change the way people think about the Internet. You can contact them at 1-800-Webpass!
Step Inside Webpass! CyberTECH 2015 Newsletter Sponsor Webpass, a leading Internet service provider in the San Diego area is now delivering residential Internet connections at 100, 200, or 500 Mbps and business Internet connections from 10-1000 Mbps. There has never been a better time to cut the cable and switch to Webpass for your Internet needs! As the owner and operator of its Ethernet network, Webpass promises customers a simple urban Internet experience. They distinguish themselves from the competition through the simplicity of set-up, absence of contracts and personable customer service.
Sign up today and instantly browse the Internet without modems, contracts or gimmicks.
Webpass 1360 5th Avenue San Diego, CA 92101 1-800-Webpass
A key CyberTECH operating principle is collaboration. We are always looking to partner with individuals and organizations looking to get involved in various cyber and IoT initiatives throughout the region and across the globe. Opportunities include event chair, volunteer, champion, program chair and more. For additional information on how you can support CyberTECH, please contact Julia Scholl.
Cleaning big data usually invokes big stress levels. With the advancements we've seen in technology over the years, many industries have been transformed from the very core. For academic researchers and data scientists, data has become more expansive and detailed than ever before. However, this has also affected business analysts and millions of others in […]
In the ever-growing world of big data, the benefits of implementing a strong data quality and data governance program become more apparent. Poor data quality can cost a company major financial losses as well as reputational damage. In numerous surveys, IT directors and program managers agree that poor data quality is a major obstacle […]
Technology products can change fast and Scribe's integration platform as a service (iPaaS) is no exception. Like most cloud services, Scribe's iPaaS is upgraded on a continual basis with several releases each year. Just over the past few months we have introduced an entirely new user experience, added support for new connectors (e.g., big data connectors for Amazon S3 and Amazon...
More technologies are simultaneously reaching maturity than at any other time in recent memory. Getting the most out of cloud, mobile, big data, IoT, machine learning, artificial intelligence and other maturing technologies will require organizations to open themselves to new ways of thinking.
Automattic, the company behind WordPress, will be closing its San Francisco office, apparently because very few employees have been choosing to show up for work in person, Quartz reports.
Automattic has long given its blessing to working remotely. Being in the office was always optional, and the company even provides financial support for employees to work from other locations. It offers employees $250 a month to use co-working offices, and if an employee works from a Starbucks, Automattic offers to pay for his or her drink. Now so few people are coming into the office that keeping it open just doesn't seem to be worth the cost anymore.
Tech companies take a wide variety of approaches to employees working remotely. In 2013, Yahoo CEO Marissa Mayer famously banned working from home. Mayer, who received a lot of criticism for the decision, said she didn't want employees working offsite because people in the office were complaining that they rarely interacted with their remote counterparts. She also felt that people could be more collaborative if they were in the same place and cited the development of a weather app in conjunction with Flickr as an example of the benefits of working together in person.
Similarly, in March, IBM, which was an early supporter of working from home, told remote employees on its U.S. marketing team that they would have to work in person at one of six locations or find a new job. Other departments, including security and procurement, had previously been told to work from offices. This is a surprising change for a company that embraced remote work in the '80s and '90s. It even announced in 2009 that having so many remote employees (40 percent of the company, to be exact) saved it about $100 million annually on office space in the U.S.
Experts are a bit divided on the real-world results of remote working. In 2011, researchers from Harvard Medical School looked into the "water cooler effect": the idea that employees can increase productivity by speaking to one another in an informal setting during their breaks.
The researchers examined 35,000 biomedical science papers involving 200,000 authors. They found that, on average, papers were cited more often when the first and last authors had more in-person contact. Papers with four or fewer authors in the same building were also cited more than papers whose writers were in separate locations.
On the other hand, Nicholas Bloom, a professor of economics at Stanford, acknowledges that working from home has a negative reputation but believes the research disproves it. Bloom himself conducted a study on remote work in 2014 by observing workers from the call center of Ctrip, a Chinese travel website. For the study, workers were allowed to volunteer to work from home for nine months. Bloom found that the group that stayed home was 13 percent more productive, and quit rates among them were cut in half. The frequency with which people quit a job is important because the average firm has a 50 percent turnover rate and companies waste time and money trying to hire new people every year, Bloom said in an interview. "The key thing to note is anecdotes are great, but big data shows working from home is rising."
He's right. According to a 2017 Gallup poll, 20 percent of U.S. employees work remotely 100 percent of the time. Tiny Pulse conducted a survey of 509 U.S. employees who are permanent remote workers and found that they tend to be happier at work and feel more valued compared to the overall group of workers, though they report having "a lower relationship with their co-workers," as Tiny Pulse puts it.
It happens to me at least once a week: I want to check the progress of some heavy script that runs in chunks over a big dataset, only to find out that it writes intermediate data to a temporary table. Last time it happened 3 days ago when I wanted to analyze...(read more)
Katherine Glasheen has a nickname fit for an engineer: "Machine." And it is not just because it rhymes with her last name.
A second-year aerospace Ph.D. student, she has a drive to advance technology and is conducting research on socially aware drones, a project that will become increasingly important with wider adoption of UAVs. Today, however, the work is future-focused enough that even her advisor calls it "kind of wild."
"The technology is developing faster than society can handle," Glasheen says. "One drone delivering a package in downtown Denver is challenging enough, but what about when there are hundreds of them? We need systems that are scalable and robust."
Her proposal calls for using internet data to infer local attitudes about drones.
"If the UAV could analyze news articles and comments on websites and knew people in the area it was traveling were uncomfortable with drones, it could deliberately avoid flying over places like schools, hospitals, and parks,â Glasheen says.
"The idea is to create an 'ethical drone' that understands and tries to respect local attitudes. It is a novel way to think about combining unmanned systems, big data, and artificial intelligence. I've not heard anyone suggest it before," Frew says.
Less Artificial, More Intelligence
To get there, Glasheen is first working on improving more conventional trip planning methods.
"For a delivery drone, the path it's following is preplanned and loaded before it takes off. That doesn't account for any variables it could encounter on the way where it would need to change course," Glasheen says.
What kind of variables? Think of the things you encounter driving to the grocery store. As humans, we can quickly react if a driver runs a red light. If we encounter traffic, we can take a different route.
Glasheen wants the drone to be able to change course and make adjustments midflight, but the AI is not the only problem. The kind of computer needed to process that much data is large and heavy, and would quickly turn a flying drone into a grounded paperweight.
Drones in the Cloud
That is where the cloud can come in. She sees a future where UAVs can regularly contact cloud systems to relay problems and determine solutions.
"The drone has a small brain, but there's a big brain in the cloud. If the drone could ping the cloud and asks for help, you can get a solution to safely navigate through an environment," Glasheen says.
The technology has great potential for the future. While delivery drones are often discussed as a public use, a UAV that can exchange data with the cloud could improve military reconnaissance and even weather forecasting.
"It's all so exciting. The field is evolving every day and you can see new applications," Glasheen says. "A lot of it still unknown, which makes some people uncomfortable, but for me it's thrilling."
TX-Irving. All applicants must have a minimum of 7+ years of industry experience in order to apply. No 3rd-party resumes accepted. LOCATION: Dallas, Texas. DURATION: 6 months. JOB RESPONSIBILITIES: We are doing a large Big Data project for a healthcare client in Dallas. We need someone who can look at the data we have loaded now and map it to how we will have to set up and load it in Cask. Should be Cask on Horton.
Other than noting back in January that all three(!) of my talk proposals were accepted, I haven't blogged about them since, so the only information about them is on the cf.Objective() web site. The session overviews give a fair sense of what you should get out of each presentation and roughly what they'll cover.
Since I have just now finished the three presentations and got all the code working, I thought I'd write up some thoughts about the talks, to help folks who are on the fence decide 'for' or 'against'.
ORM, NoSQL, and Vietnam plays on blog posts by Ted Neward and Jeff Atwood to put Object-Relational Mapping under the microscope and look at where the mapping breaks down and how it can "leak" into your code, making your life harder. After that I take a quick walk through the general "NoSQL" space and then focus on document-based data stores as a good (better?) match for OOP, providing examples based on MongoDB and cfmongodb, with a quick look at how common SQL idioms play out in that world.
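To make the relational-vs-document contrast concrete, here is a minimal sketch in Python (plain dicts standing in for tables and documents; the order/line-item example is my own illustration, not taken from the talk). The point is that the document shape stores the whole aggregate the way application objects already look, so no join-time reassembly is needed.

```python
# Relational shape: an order is scattered across two "tables" and
# must be reassembled at read time with a join.
orders = [{"id": 1, "customer": "Acme"}]
order_lines = [
    {"order_id": 1, "sku": "A-100", "qty": 2},
    {"order_id": 1, "sku": "B-205", "qty": 1},
]
joined = [
    {**o, "lines": [{"sku": l["sku"], "qty": l["qty"]}
                    for l in order_lines if l["order_id"] == o["id"]]}
    for o in orders
]

# Document shape: the same aggregate stored as one nested document,
# matching how the object graph looks in application code.
order_doc = {
    "_id": 1,
    "customer": "Acme",
    "lines": [{"sku": "A-100", "qty": 2}, {"sku": "B-205", "qty": 1}],
}

# Both routes yield the same line items; the document skips the join.
assert joined[0]["lines"] == order_doc["lines"]
```

This is also where the mapping "leaks" less: there is no impedance mismatch between the stored shape and the in-memory shape.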
Humongous MongoDB looks at replica sets, sharding, read preference, write concern, map/reduce and the aggregation framework, to show how MongoDB can scale out to support true "Big Data". The talk will feature a live demo of setting up a replica set and using it from CFML, including coping robustly with failover, and a live demo of setting up a sharded cluster (and using it from CFML) to show how MongoDB handles extremely large data sets in a fairly simple, robust manner.
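As a rough illustration of how hash-based sharding spreads data, here is a simplified single-process Python sketch. MongoDB's real hashed shard keys, chunk splitting, and balancer are considerably more involved; the shard count, MD5 choice, and routing function here are assumptions for demonstration only.

```python
import hashlib

NUM_SHARDS = 4

def shard_for(key) -> int:
    # Hash the shard key so monotonically increasing keys (e.g. sequential
    # _ids) are spread across shards instead of piling onto one.
    digest = hashlib.md5(str(key).encode()).digest()
    return int.from_bytes(digest[:8], "big") % NUM_SHARDS

# Route 10,000 documents with sequential _ids to shards.
shards = {s: [] for s in range(NUM_SHARDS)}
for i in range(10_000):
    shards[shard_for(i)].append({"_id": i})

sizes = [len(docs) for docs in shards.values()]
assert sum(sizes) == 10_000       # every document landed somewhere
assert min(sizes) > 2_000         # and the spread is roughly even
```

Range-based sharding would instead route contiguous key ranges to the same shard, which keeps range scans cheap but can hot-spot a single shard under sequential inserts; that trade-off is exactly what the talk's sharded-cluster demo explores.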
At the start of each of my talks, I have a "You Might Prefer..." slide listing the alternative talks you can attend if you don't fancy mine after you've seen the agenda slide - I won't be offended!
The slides (and all code) will be available after the conference. I'll post the slides to my presentations page and the code will go up on my Github repository. If any user groups would like me to do remote presentations of these talks later in the year (and record them and post them to Charlie Arehart's UGTV site), just contact me to figure out dates and times.
We all have seen or heard about Big Data in various environments of science and technology such as meteorology, astronomy, physics simulations, demographic studies, Internet usage, and transaction analysis. Apparently, the government, military, and even the scientific community uses Big Data. So, what is so big about Big Data?
Remember the halcyon days of the Dot-Com era? A frothy stock market, venture capital money flowing like water and famous sock puppets characterized the exuberance of the day. One company (Boo.com) spent $188 million in just six months to create an online fashion store. And 16 start-ups spent over $2 million each for a 30 […]
Sarasota, FL, Aug. 08, 2017 (GLOBE NEWSWIRE) -- Zion Market Research has published a new report titled "Hadoop Market by Type (Software, Hardware and Services) for BFSI, Government Sector, IT & ITES, Healthcare, Telecommunication, Retail and Other End-Uses: Global Industry Perspective, Comprehensive Analysis, Size, Share, Growth, Segment, Trends and Forecast, 2016-2022". According to the report, the global Hadoop market was valued at approximately USD 7.69 billion in 2016 and is expected to reach approximately USD 87.14 billion by 2022, growing at a CAGR of around 50% between 2017 and 2022.
Hadoop is an open source framework designed for storing and processing big data in a distributed environment across clusters of computers. To store and process unstructured and structured data, it uses simple programming models. It is designed to scale up from single servers to many machines, each offering local computation and storage.
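The simple programming model Hadoop builds on can be sketched in a few lines of plain Python: a map phase emits key-value pairs from each input chunk (one chunk per node in a real cluster), and a reduce phase aggregates the pairs by key. This is a single-process toy of the MapReduce pattern, not the distributed framework itself.

```python
from collections import defaultdict
from itertools import chain

def map_phase(chunk: str):
    # map: emit a (word, 1) pair for every word in this chunk of input
    return [(word.lower(), 1) for word in chunk.split()]

def reduce_phase(pairs):
    # shuffle + reduce: group pairs by key, then sum each group's counts
    totals = defaultdict(int)
    for word, count in pairs:
        totals[word] += count
    return dict(totals)

# In a cluster, each chunk would live on a different machine and the
# map phase would run where the data is stored (local computation).
chunks = ["big data big clusters", "data moves to the data"]
mapped = chain.from_iterable(map_phase(c) for c in chunks)
counts = reduce_phase(mapped)
assert counts["data"] == 3 and counts["big"] == 2
```

Because each map call touches only its own chunk, the work parallelizes naturally across machines, which is what lets the model scale from one server to many.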
Browse through 29 Market Tables and 32 Figures spread through 110 Pages and in-depth TOC on "Hadoop Market by Type (Software, Hardware and Services) for BFSI, Government Sector, IT & ITES, Healthcare, Telecommunication, Retail and Other End-Uses: Global Industry Perspective, Comprehensive Analysis, Size, Share, Growth, Segment, Trends and Forecast, 2016-2022".
Hadoop is useful as a scalable storage platform, since it can store and distribute very large amounts of data. Another benefit is cost-effective storage for businesses. Additionally, Hadoop is flexible: it enables businesses to easily access new data sources and tap into different types of data, both structured and unstructured, to generate value from that data. ...
You might already know that IUPUI offers more than 350 undergraduate, graduate and professional programs. And come this fall, there will be a few more. Here's a look at seven new academic programs from a variety of schools across campus:

Ph.D. in data science, School of Informatics and Computing: This degree, the first of its kind in Indiana and in the Big Ten, and one of only a handful in the United States, leads to positions in academia as well as in industry. In fact, Glassdoor, a job and employment-recruiting website, ranks data scientist as the No. 1 job in America based on the number of job openings, salary and overall job-satisfaction rating. According to Glassdoor, the median base salary for a data scientist is $116,840. The field of data science involves collection, organization, management and extraction of knowledge and insights from massive, complex, heterogeneous data sets commonly known as "big data."

Ph.D. in American studies, School of Liberal Arts: This nontraditional doctoral program looks to recruit students interested in exploring issues through a multidisciplinary approach, drawing on courses already being offered across the School of Liberal Arts. A doctoral internship of at least a year will help students translate their research into a variety of careers. "The Ph.D. program in American studies at IUPUI does not tweak the traditional Ph.D. model, but rather builds an infrastructure for a collaborative and applied graduate school experience in order to close the distance between academia and the world that surrounds it," said Raymond Haberski Jr., professor of history and director of American studies.

Graduate minor in communicating science, Department of Communication Studies, School of Liberal Arts: Scientists and health professionals today need to connect to and engage with the lay public, policymakers, funders, students and professionals from other disciplines.
As a result, they find the need to tailor their communication for a variety of audiences. This program, designed for future scientists, including researchers and practitioners, who find themselves increasingly responsible for public speaking and writing, will increase students' career prospects, help them secure funding and help them serve as effective teachers. "The courses will offer more than public speaking and writing tips," said Krista Hoffmann-Longtin, assistant professor of communication studies in the School of Liberal Arts and assistant dean for faculty affairs and professional development in the School of Medicine. "Scientists will learn to improvise messages; to tell relevant stories; and to connect effectively with students, collaborators and funders."

Liberal arts and management certificate, School of Liberal Arts: A 2013 study suggests that a liberal arts degree coupled with other skills can nearly double job prospects when those skills include marketing, business, data analysis and management, just to name a few. "This certificate offers a course of study from both liberal arts and business to better prepare the 21st-century liberal arts graduate to respond to the challenges of a more complex world," said Kristy Sheeler, associate dean for academic programs in the School of Liberal Arts and a professor in the Department of Communication Studies. Contact Sheeler with questions about this new program.

Doctor of public health in global health leadership, Richard M. Fairbanks School of Public Health: The school already knows what some students in this new program will do when they graduate: They'll become state health commissioners; ministers of health; program officers; and mid- to senior-level managers in government agencies, foundations, nonprofits and nongovernmental organizations. That's based on experiences of a similar program at the University of North Carolina at Chapel Hill.
The person who helped design and lead that program is now at IUPUI: Sue Babich, associate dean of global health, director of the doctoral program in global health leadership, and professor of health policy and management. The degree prepares students to be leaders who can address the world's challenging and complex public health issues. The three-year degree is a distance program, with classes delivered in real time via internet video. Students meet face-to-face three times each year in years one and two, and they complete dissertations in year three.

Master of Science degree in product stewardship, Richard M. Fairbanks School of Public Health: The only academic degree available today designed to prepare students for leadership roles in the emerging field of product stewardship will train professionals to help businesses in a wide range of industrial fields navigate increasingly complex regulations as they advocate for the production of products in ways that ease regulatory compliance, minimize risks to people and the environment, and boost profitability. The online 30-credit-hour degree is expected to attract, among others, professionals who are already active in the product-stewardship field seeking formal training that will allow them to move up in their product-stewardship organizations, as well as professionals from a wide range of other backgrounds, including environmental health, regulatory compliance, industrial hygiene, occupational health and safety, sustainability, product development, supply chain, and law.

Master of Arts in teaching English to speakers of other languages (TESOL), Department of English, School of Liberal Arts: This 31-credit-hour degree provides both a strong theoretical foundation and hands-on practical experience to prepare national and international graduate students to become effective teachers of English to adult learners who speak other native languages, both in the United States and abroad. Working with IUPUI's award-winning faculty, students
will experience rich opportunities in teaching practica, including not only English for academic purposes but also English for specific purposes, for example, academic, legal, business and medical English. The program features a unique curricular strength in second-language research, materials preparation, curriculum design and the use of technology in second-language learning. "It is thrilling to be able to launch the Master of Arts in TESOL at IUPUI," said Ulla Connor, director of the program. "This program is the culmination of TESOL and applied linguistics programming in the Department of English at IUPUI over the past 30 years. Our previous programs include the English for Academic Purposes Program for international students, which began in 1985; the International Center for Intercultural Communication, which started in 1998; and the Program for Intensive English that we began in 2015."
Because, come on, putting your tendentious conclusion right there in the title and disguising it as a question, while an impressively textbook instance of question-begging, in this context is also pretty funny. Because, "Hey, we've already established that Amazon is a monopoly; we're just here to determine how much of a threat the company poses to Freedom and All That Is Good. Is it an existential threat, like Roger Cohen said about ISIS? Or merely an extremely threatening threat?"
And who knows, maybe they'll answer the question, "No," right? Maybe the panelists will decide that Amazon's "book monopoly" is actually a benefit to freedom of expression, as monopolies often are. It's not as though they've structured things so that the question answers itself, and I don't know why anyone would suspect this panel might be anything other than a diverse collection of open-minded people honestly engaging in free inquiry and the pursuit of knowledge wherever the facts may lead!
Thanks to the efforts of serious-sounding organizations like New America (and if that vague but happy-sounding name didn't cause your bullshit detector to at least tingle, it should; see also Americans for Prosperity and the Center for American Progress), this "Amazon is a Monopoly" silliness is so persistent that Joe and I dealt with it in our inaugural post on zombie memes: "arguments that just won't die no matter how many times they're massacred by logic and evidence." Half the purpose of the Zombie Meme series is to save Joe and me from having to repeat ourselves, so if you want to have a laugh about why, despite its persistence, "Amazon is a Monopoly" is so embarrassingly dumb and misguided, here's your link.
But here's the amazing part: "Amazon is a monopoly" is actually the clever half of the event's title. The really funny part is what follows: that Amazon poses a threat to freedom of expression!
Given that Amazon's self-publishing platform enables all authors to publish whatever they like and leaves it to readers to decide what books they themselves find beneficial, while the New York Big Five (no concentrated market power in a group with a name like that!) has historically rejected probably 999 books for every one they deem worthy of reaching the public, a few questions present themselves. Such as:
• Who has really been "manipulating and supervising the sale of books and therefore affecting the exchange of ideas in America," and who has really "established effective control of a medium of communication": an entity that screens out 99.9% of books, or one that has enabled the publication of any book?
• Who has really been running an uncompetitive, controlled, supervised, distorted market for books: a company dedicated to lower prices, or a group calling itself the Big Five that has been found guilty of conspiracy and price fixing?
• Who is really restoring freedom of choice, competition, vitality, diversity, and free expression in the American book market: an entity that consigns to oblivion 999 books out of a thousand, or one that enables the publication of all of them?
• And who is really ensuring that the American people determine for themselves how to take advantage of the new technologies of the 21st century: an entity responsible for zero innovation and dedicated to preserving the position of paper, or one that has popularized a new publishing and reading platform that for the first time offers readers an actual choice of formats?
Think about it. This "New America" organization has put together a panel dedicated to persuading you that there was more freedom of expression when an incestuous group of five Manhattan-based corporations held the power to disappear 999 books out of every thousand written, and indeed performed that disappearance as the group's core function (they call this "curation"). And that, now that Amazon's KDP platform has enabled all authors to publish virtually anything they want, freedom of expression is being threatened.
For an organization calling itself "New America," these jokers sure seem wedded to the old version.
In fairness to New America, I should note that their worldview is hardly unprecedented. The notion that the traditional way of doing things is ipso facto the best way of doing things was lampooned by Voltaire more than 250 years ago through his character Dr. Pangloss, who was convinced (before experience in the world introduced doubts) that "All is for the best in this best of all possible worlds." And Pangloss was himself based on the religious philosophy known as theodicy, a word coined over 300 years ago to describe a kind of faith that's doubtless as old as the human race (and a word I admit I like because it sounds a bit like "idiocy").
In fact, it was as recently as, say, the 1950s that a group of tweed-jacketed, straight white male college professors were genuinely convinced that the collection of books they deemed the most intrinsically worthy (all, coincidentally, written by other straight white males) represented the maximally possible amount of valuable expression, information, and ideas. They even called their collection the "canon," which I admit did tend to make their subjective choices sound important and even divinely ordained. As people came to question the absence of women and minority writers from this collection selected exclusively by straight white males, I imagine the straight white males genuinely believed that broadening the "canon" to include women and minorities was a threat to freedom of expression and all that. This is just the way a lot of people are wired, especially when status and privilege are part of the mix.
And really, you do have to take a moment to applaud the mental gymnastics required of otherwise presumably intelligent people to say shit like âmore authors writing more books reaching more readers is threatening freedom of expression, the flow of information, and the marketplace of ideas.â Itâs War is Peace/Ignorance is Strength/Freedom is Slavery level doublethink. On the one hand, itâs sad, but on the other hand, in all the universe could there be a race as capable as humans of clinging so resolutely to faith in the face of so many contrary facts? Seen in this light, thereâs something tragically beautiful about it.
And while I admit that New Americaâs âday is night, black is whiteâ bizarro worldview isnât easy to parody, I canât resist trying. Soâ¦
Coming up next from New America: The Internetâs Dictatorial Grip: Impeding Access to Information? And The Tyranny of the Cell Phone: Shutting Down Communication? And Our Addiction to Paved Roads: A Threat to Freedom of Movement?
One more thing about this event that's unintentionally hilarious, and then I need to get back to something worthwhile (AKA, the new manuscript). Take a look at the guest list. If you hired a team of NASA scientists to design the most rabidly, incestuously anti-Amazon panel possible, this is pretty much the group the team would propose. Though I doubt even the scientists (assuming they had a little dignity) would have gone so far as to bring in Douglas Preston and his literary agent, Eric Simonoff. I mean, this is getting pretty close to just adding clones of existing panelists and eliminating the last fluttering fig leaf of diversity.
They also have the dean of the Amazon Derangement crowd, Scott Turow. And Franklin Foer, who in fairness should be disqualified from even being on this panel because of his claim, in his much-derided "Let us kneel down before Amazon" screed, that "That term [monopoly] doesn't get tossed around much these days, but it should"!
By the way, I wouldn't be surprised if Foer makes the same cringe-worthy claim again, on this very "Amazon is a Monopoly" panel. The anti-Amazon crowd has never been particularly educable.
Also present will be Mark Coker, the head of Smashwords, an Amazon competitor. And author Susan Cheever, a member of Authors United, an organization that represents pretty much the platonic ideal of Amazon Derangement Syndrome. A couple of anti-trust lawyers to provide a veneer of legal gravitas (and to troll for clients, no doubt). And a second-year law student named Lina Khan who has argued that Amazon "should alarm us."
And that's it. That's as diverse and wide-ranging as the lineup gets. The full gamut of viewpoints, from A…all the way to B.
Although really, even that feels a little generous.
Oh, by the way, Eric Schmidt, Executive Chairman of Google, another Amazon competitor, is the chairman of New America's board of directors, too. No conflict of interest there. Nothing to disclose to anyone who might think this is some sort of disinterested, scholarly event.
So yeah, it's really that much of a hive-mind lineup. But that's not even the best part. The best part is, this remarkably insular and incestuous exercise in groupthink has been assembled to speak out against a purported threat to…freedom of expression! The flow of information! And the marketplace of ideas!
None of this is an accident, by the way. It isn't just stupidity and incompetence. There's a reason organizations will try to take a narrow outlook and propagate it through multiple mouthpieces: doing so can create the impression that a rare and radical notion is in fact widely held, held even by ostensibly disparate groups, and therefore more trustworthy. Indeed, this form of propaganda is a favorite of some of the same reactionary groups New America is showcasing on its panel. As I said recently about the supposedly "unprecedented joint action" of some booksellers, authors, and agents complaining together about Amazon:
Which brings us to the second revealing aspect of this "propaganda masquerading as an interview" drill. You see, in the standard "blow-job masquerading as interview" gambit, it's generally enough to hope the reader will just assume the interviewer and interviewee are working at arms-length. Making the point explicitly isn't really the done thing. Here, however, perhaps not trusting readers to be sufficiently gulled, the ABA and AG are at pains to describe the "unprecedented joint action" of the AG, Authors United, the ABA, and the Association of Authors' Representatives in going after Amazon for monopolizing the marketplace of ideas, devaluing books, and generally crushing dissent, democracy, and all that is good. The impression they're trying to create is, "Wow, if so many separate organizations hate Amazon, Amazon must be doing something bad."
But what's critical to understand is that the most fundamental purpose of the Authors Guild, Authors United, the American Booksellers Association, and the Association of Authors' Representatives is to preserve the publishing industry in its current incarnation. Whatever marginal differences they might have (I've never actually seen any, but am happy to acknowledge the theoretical possibility) are eclipsed by this commonality of purpose. Under the circumstances, the fact that these four legacy publisher lobbyists agree on something is entirely unremarkable (indeed, what would be remarkable would be some evidence of division). But if people recognize the exercise as a version of "No really, I read it somewhere…okay, I wrote it down first," the propaganda fizzles. And that's why these propagandists have to nudge readers with the bullshit about the "unprecedented joint action." Otherwise, when Authors Guild Executive Director Mary Rasenberger cites Authors United pitchman Doug Preston as though Preston were a separate, credible source, people might roll their eyes instead of nodding at the seriousness of it all. They might even giggle at the realization that all those "When did Amazon stop beating its wife?" questions were functionally being put by Rasenberger to herself.
So no, this wasn't remotely a cross-examination, or even a cross pollination (indeed, publisher lobbyists are expert at fleeing anything that offers even the slightest whiff of actual debate, which does make their alleged devotion to the Free Flow of Ideas and Information as the Engine of Democracy worthy of a smile, at least, if nothing else). It was just a stump speech lovingly hosted by someone else's blog. The sole reason for the exercise was to create the misleading appearance of multiple, arms-length actors when functionally there is only one.
In fairness to the aforementioned Unprecedentedly Joint Actors, there is a rich heritage behind this form of propaganda. For example, in the run-up to America's second Iraq war, Dick Cheney would have someone from his office phone up a couple of pet New York Times reporters, who would then dutifully report that anonymous administration officials believed Saddam Hussein had acquired aluminum tubes as part of his nuclear weapons efforts…and then Cheney would go on all the Sunday morning talk shows and get to say, "Don't take my word for the aluminum tube stuff; even the New York Times is reporting it!"
So leave aside the fact that the "joint action" in question is anything but unprecedented, that it is in fact publishing establishment SOP. Anyone familiar with the record of these organizations will instantly realize that the "unprecedented joint action" in question is a lot like the "joint action" of all four fingers (plus the thumb!) of someone throwing back a shot of tequila. Like that of a little boy pleasuring himself (with both hands!) and trying to convince anyone who will listen that the Unprecedented Left and Right Action is proof that "Everybody loves me!"
Okay, I apologize for the multiple excerpts from previous posts. But what are you going to do? These bloviators keep vomiting up the same tired bullshit, no matter how many times it's debunked. It just saves time to refer to the previous debunkings rather than typing it all out again.
My advice to New America? If you're more than just a propaganda operation, if you really do care about freedom of expression, and the flow of information, and the marketplace of ideas, you might want to add at least a token panelist with a viewpoint that differs even just a tiny bit from that of the nine Borg you've assembled to intone that Amazon Is Evil and Will Destroy All That Is Good. Otherwise, your event is going to feel more like a circle jerk and less like sex. And, doubtless, with similarly productive results.
Joe sez: And just when I think I'm out…
Thanks, Barry, for turning a spotlight on this silliness, and patiently picking apart why it is so silly. I'm sure the panel will be a resounding success, much like all circle jerks and echo chambers are for those involved. Masturbation is supposed to be satisfying, and a nice "atta boy!" and backslap at the finish seems preferable to eating the soggy biscuit.
Don't Google that if you don't know what I mean. You can't unlearn it.
One of the reasons I've largely eschewed activism lately is because I haven't seen any ill effects from all the Amazon bashing being done by the usual spin doctoring suspects.
At the risk of invoking Godwin's Law, the propaganda classic Triumph of the Will was just released on BluRay for the first time. It's an effective piece of filmmaking, and Frank Capra imitated a lot of elements from it for his Why We Fight series.
But I don't think this approach works when it comes to Amazon. People aren't so ready to buy what the pinheads are selling. Today we can have the New York Times, which I believe still has the motto "All the news that's fit to print," show such stunning anti-Amazon bias that the public editor has called it out more than once, and the public simply doesn't give a shit. Amazon still gets their approval and their business, no matter how many times David Streitfeld one-finger-types his screeds while busting out knuckle babies with his other hand.
The public likes Amazon. Even if it were true that Amazon is planning to overthrow the government and replace the Bill of Rights with a guarantee of same day free shipping, its approval rating is so high that I don't think most folks would care.
But for all the alarmist rhetoric and soothsaying predictions of world domination, I've yet to see anyone other than Big 5 apologists and their NY media cronies show much concern over Amazon's mounting dominance of online retail.
Maybe that's because (wild guess here) Amazon offers authors unprecedented opportunity to reach readers, and offers readers the widest selection at the lowest possible prices coupled with good customer service.
Authors United, and the NYT, are doing everything they're supposed to be doing to spread their anti-Zon propaganda, but the people don't care.
If I had faith in human nature, I'd posit that access to the Internet (and the ability for anyone with second grade spelling skills to type words into a search engine) can reveal in a click or two what utter nonsense the morons are spouting.
But I think the more realistic answer is that people simply like Amazon because it has a wide selection, low prices, and good customer service.
So I no longer feel the need to correct the greedy, self-interested 1% of authors who want to prop up an archaic, inefficient, and ruthless publishing industry with stupid organizations and articles and events. Joe Average might very well read about this panel in a Streitfeld spate of "journalism," cluck his tongue at how Amazon is destroying freedom of expression, and then quickly forget about it when the UPS guy knocks on the door with a box of Bounty because yesterday Joe used his Amazon Dash button to order more.
The legacy publishing industry is dying. Once it lost its lock on distribution, it lost the majority of its power. The only ones who will mourn that industry are the few handfuls of authors it made rich. And when their corporate masters merge and downsize into inevitable bankruptcy, watch how quickly they jump on Amazon's teat when the seven figure advances are gone.
But, for old times' sake, let me fisk New America's event description. Their nonsense in italics, my replies in regular font.
Amazon dominates the U.S. book market to a degree never before seen in America.
But does it dominate the U.S. book market to a degree never before seen in Canada?
Okay, I'm making fun of the lousy sentence, but isn't that like saying "In my house I dusted the bookcases to a degree never before seen in my house?"
That's silly. Especially since I switched to ebooks and got rid of my bookcases.
This corporation dominates every key segment of the market.
Wow, that's a lot of dominance. I hope the public has a safeword.
And this immense size gives Amazon unprecedented power to manipulate the flow of books, hence of information and ideas, between author and reader.
OK, reread what Barry and I have written here. For over a hundred years, publishers have refused to publish the overwhelming majority of books, essentially preventing the public from ever reading them. They had a right to do that, just like Chick-Fil-A has a right to be closed on Sundays for ridiculous religious reasons.
But unlike the Big 6, or Chick-Fil-A, Amazon is allowing more traffic than ever before. More books are flowing with Amazon than flowed with the Big 6.
Plus, Amazon isn't a monopoly, and doesn't control the Internet, so if Amazon decides it doesn't want to sell something, it can't prevent that something from being sold elsewhere.
Last summer a group of authors made the case that Amazon's actions constitute an abuse of its monopoly powers and threatens this vital marketplace of ideas.
It was a shitty case. But let's not allow facts to get in the way of good propaganda. Because if you keep repeating the same lie, some people are bound to start believing it.
Amazon's actions, they wrote, may already be affecting what authors write and say.
As evidenced by Amazon refusing to sell any work by any signatory of Authors United.
But look how Amazon has forced writers to cower in the shadows, fearful of offering any sort of critique.
Hmm. Doesn't a panel about Amazon restricting freedom of expression prove that Amazon can't restrict freedom of expression? Or if it can, doesn't want to?
Oops, my bad. They used the word "may". So it could read "may already be affecting what authors write and say, even though there is no evidence or logic to support that conclusion." Like someday I "may" own my own country, which I'll name Joetopia and make our main export beer parties. If you'd like Joetopia to export a beer party to you, let me know because it "may" happen. Wait by the phone until you hear back.
The authors strongly urged antitrust regulators to take action, in what would be the most important antitrust case since Microsoft in the late 1990s.
Barry and I take a lot of time to add these links to prove our points. You diligent readers are clicking on them, right?
Join New America's Open Markets program for a discussion of Amazon's monopoly over books and what it means for American readers and America's democracy.
For God's sake, someone think of the children! Because an online retailer is all that stands between the freedom to vote for representatives in government (that's the definition of democracy), and a zombie world where neighbors feast on neighbors and the only law comes from the business end of a twelve gauge. Because that argument makes as much sense as theirs.
Some of the nation's best-known authors will discuss their personal experiences with Amazon.
And nary a one with a contrary point of view! Perhaps because they couldn't find any author with a good personal experience with Amazon. I mean, other than a hundred thousand or four. But I'm sure New America has much better things to do with their time than a little research.
Antitrust lawyers and experts in Big Data and price discrimination will then discuss the larger effects of the corporation's behavior, and whether the government should bring a case against Amazon.
With Data so Big it's Capitalized! Did that become a thing and I missed it?
And what could they possibly say in regard to price discrimination? Amazon fights to keep prices low. The Big 6 fight to keep them high. They illegally collude to keep them high. They print the prices on their damn books to keep them high.
Could they be going into the nefarious business practice of co-op, and Amazon charging publishers for better visibility? Is that the discrimination they mean? Or maybe loss leads?
Last I checked, both were not only legal, but commonplace in retailers.
I wonder what the antitrust lawyers will say about Amazon allowing anyone to sell through Amazon. In other words, if Amazon decided it no longer wanted to sell Big 6 titles, I could open up an Amazon seller account and sell Big 6 titles on Amazon. Can someone explain to me how that limits the flow of books between reader and author?
Follow the discussion online using #BookMonopoly and follow us @NewAmerica.
No thanks. But here's a hashtag you can follow: #StoptheStupid.
Lunch will be provided.
And it will be the only substantive thing offered that afternoon.
Now I'm going back to my WIP. When the NYT write-up of this stupid event runs, I'm going to ignore it.
From deities to data - "For thousands of years humans believed that authority came from the gods. Then, during the modern era, humanism gradually shifted authority from deities to people... Now, a fresh shift is taking place. Just as divine authority was legitimised by religious mythologies, and human authority was legitimised by humanist ideologies, so high-tech gurus and Silicon Valley prophets are creating a new universal narrative that legitimises the authority of algorithms and Big Data." Privileging the right of information to circulate freely - "There's an emerging religion called Dataism, which venerates neither gods nor man - it worships data. From a Dataist perspective, we may interpret the entire human species as a single data-processing system, with individual humans serving as its chips. If so, we can also understand the whole of history as a process of improving the efficiency of this system... Like capitalism, Dataism too began as a neutral scientific theory, but is now mutating into a religion that claims to determine right and wrong... Just as capitalists believe that all good things depend on economic growth, so Dataists believe all good things - including economic growth - depend on the freedom of information."
Our unparalleled ability to control the world around us is turning us into something new - "We have achieved these triumphs by building ever more complex networks that treat human beings as units of information. Evolutionary science teaches us that, in one sense, we are nothing but data-processing machines: we too are algorithms. By manipulating the data we can exercise mastery over our fate."
Planet of the apps - "Many of the themes of his first book are reprised: the importance of the cognitive revolution and the power of collaboration in speeding the ascent of Man; the essential power of myths, such as religion and money, in sustaining our civilisations; and the inexcusable brutality with which our species treats other animals. But having run out of history to write about, Harari is forced to turn his face to the future... 'Forget economic growth, social reforms and political revolutions: in order to raise global happiness levels, we need to manipulate human biochemistry'... For the moment, the rise of populism, the rickety architecture of the European Union, the turmoil in the Middle East and the competing claims on the South China Sea will consume most politicians' attention. But at some time soon, our societies will collectively need to learn far more about these fast-developing technologies and think far more deeply about their potential use."
Each technological age seems to have a "natural" system of government that's the most stable and common... Anyway, now we've entered a new technological age: the information age. What is the "natural" system of government for this age?
An increasing number of countries now seem to be opting for a new sort of illiberal government - the style of Putin and the CCP. This new thing - call it Putinism - combines capitalism, a "deep state" of government surveillance, and social/cultural fragmentation.
It's obviously way too early to tell, but there's an argument to be made that Putinism is the natural system of government now. New technology fragments the media, causing people to rally to sub-national identity groups instead of to the nation-state.
The Putinist "deep state" commands the heights of power with universal surveillance, and allies with some rent-collecting corporations. Meanwhile, IF automation decreases labor's share of income and makes infantry obsolete, the worker/soldier class becomes less valuable.
"People power" becomes weak because governments can suppress any rebellion with drones, surveillance, and other expensive weaponry. Workers can strike, but - huge hypothetical assumption alert! - they'll just be replaced, their bargaining power low due to automation.
In sum: Powerful authoritarian governments, fragmented society, capitalism, "Hybrid warfare", and far less liberty.
The Totalitarian - "Putinist models seem to curtail personal freedom and self-expression. Chases away innovation class. In the long run this makes them unable to keep up with more innovative, open societies. But innovative open societies are also fissiparous in the long run. They need a strong centralized, even authoritarian, core. To wit the big democracies also have deep states, just ones that infringe on domestic public life less than Putinist do. Automation makes mass citizenry superfluous as soldiers, workers or taxpayers. The insiders' club is ever-shrinking. Steady state of AI era is grim. One demigod and 10 billion corpses/brain-in-jars depending on humanism quotient of the one. The three pillars for this end state are strong AI, mind uploading/replication, and mature molecular nanotechnology."
So what might take its place? One possibility[:] ... a global plutocracy and so in effect the end of national democracies. As in the Roman empire, the forms of republics might endure but the reality would be gone.
An opposite alternative would be the rise of illiberal democracies or outright plebiscitary dictatorships... [like] Russia and Turkey. Controlled national capitalism would then replace global capitalism. Something rather like that happened in the 1930s. It is not hard to identify western politicians who would love to go in exactly this direction.
Meanwhile, those of us who wish to preserve both liberal democracy and global capitalism must confront serious questions. One is whether it makes sense to promote further international agreements that tightly constrain national regulatory discretion in the interests of existing corporations... Above all... economic policy must be orientated towards promoting the interests of the many not the few; in the first place would be the citizenry, to whom the politicians are accountable. If we fail to do this, the basis of our political order seems likely to founder. That would be good for no one. The marriage of liberal democracy with capitalism needs some nurturing. It must not be taken for granted.
"Growth drivers from the previous round of technological progress are fading while a new technological and industrial revolution has yet to gain momentum," Mr Xi said at the start of the G20, adding that the global economy was at a "critical juncture".
"Here at the G20 we will continue to pursue an agenda of inclusive and sustainable growth," Mr Obama said, acknowledging that "the international order is under strain".
Mr Xi, whose country has arguably benefited more than any other from globalisation, struck a similarly cautious note in a weekend speech to business leaders. In China, he said, "we will make the pie bigger and make sure people get a fairer share of it".
He also recognised global inequity, noting that the global Gini coefficient, the standard measure of inequality, had raced past what he called its "alarm level" of 0.6 and now stood at 0.7. "We need to build a more inclusive world economy," Mr Xi said.
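For readers unfamiliar with the statistic Mr Xi is citing: the Gini coefficient runs from 0 (everyone has the same income) to 1 (one person has everything). As a minimal illustrative sketch, not any official methodology, it can be computed from a list of incomes using the standard closed form over sorted values:

```python
def gini(values):
    """Gini coefficient of a list of non-negative incomes.

    Returns 0.0 for perfect equality; the maximum for n people is
    (n - 1) / n, which approaches 1.0 as one person holds everything.
    """
    xs = sorted(values)
    n = len(xs)
    total = sum(xs)
    if total == 0:
        return 0.0
    # Closed form over sorted values:
    # G = 2 * sum(i * x_i) / (n * sum(x)) - (n + 1) / n, with i = 1..n
    weighted = sum(i * x for i, x in enumerate(xs, start=1))
    return 2 * weighted / (n * total) - (n + 1) / n

print(gini([10, 10, 10, 10]))  # 0.0 -- perfect equality
print(gini([0, 0, 0, 100]))    # 0.75 -- one person holds everything (max for n=4)
```

On this scale, the 0.6 "alarm level" and 0.7 reading quoted above sit near the top of the range observed for national economies.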
G20 leaders urged to 'civilise capitalism' - "Chinese president Xi Jinping helped set the tone of this year's G20 meeting in a weekend address to business executives. 'Development is for the people, it should be pursued by the people and its outcomes should be shared by the people', Mr Xi said... Before the two-day meeting, the US government argued that a 'public bandwagon' was growing to ditch austerity in favour of fiscal policy support. 'Maybe the Germans are not absolutely cheering for it but there is a growing awareness that 'fiscal space' has to be used to a much greater extent', agreed Ángel Gurría, secretary-general of the Organisation for Economic Cooperation and Development."
The rise of intelligent machines is a moment in history. It will change many things, including our economy. But their potential is clear: they will make it possible for human beings to live far better lives. Whether they end up doing so depends on how the gains are produced and distributed. It is possible that the ultimate result will be a tiny minority of huge winners and a vast number of losers. But such an outcome would be a choice not a destiny. A form of techno-feudalism is unnecessary. Above all, technology itself does not dictate the outcomes. Economic and political institutions do. If the ones we have do not give the results we want, we must change them.
From the Job Loop to the Knowledge Loop (via Universal Basic Income) - "We work so we can buy stuff. The more we work, the more we can buy. And the more is available to buy, the more of an incentive there is to work. We have been led to believe that one cannot exist without the other. At the macro level we are obsessed with growth (or lack thereof) in consumption and employment. At the individual level we spend the bulk of our time awake working and much of the rest of it consuming."
I see it differently. The real lack of imagination is to think that we must be stuck in the job loop simply because we have been in it for a century and a half. This is to confuse the existing system with humanity's purpose.
Labor is not what humans are here for. Instead of the job loop we should be spending more of our time and attention in the knowledge loop [learn->create->share]... if we do not continue to generate knowledge we will all suffer a fate similar to previous human societies that have gone nearly extinct, such as the Easter Islanders. There are tremendous threats, eg climate change and infectious disease, and opportunities, eg machine learning and individualized medicine, ahead of us. Generating more knowledge is how we defend against the threats and seize the opportunities.
There is much hype around "big data" these days - how it's going to change the world - which is causing data scientists to get excited about big data analytics, and technologists to scramble to understand how to employ scalable, distributed databases and compute clusters to store and process all this data.
Interestingly, Gartner dropped "big [...]
Datos IO's RecoverX platform solves operational challenges and reduces total cost of ownership for Maxwell's SaaS-based platform on MongoDB databases deployed on Amazon AWS public cloud. SAN JOSE, Calif., Aug. 10, 2017 — /BackupReview.info/ — Datos IO, the application-centric cloud data management company, today announced that Maxwell Health, an HR and benefits technology [...]
The presentation the CUBRID team presented at Russian HighLoad++ Conference in October, 2012. The presentation covers the topic of Big Data management through Database Sharding. CUBRID open source RDBMS provides native support for Sharding with load balancing, connection pooling, and auto fail-over features.
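To make the sharding idea concrete: the core of any sharding layer is a deterministic routing function that maps a record's shard key to one of N database shards. The sketch below is a hypothetical toy in Python, not CUBRID's actual implementation; CUBRID's SHARD broker does this natively in middleware, along with the load balancing, connection pooling, and auto fail-over mentioned above.

```python
import hashlib

class ShardRouter:
    """Toy hash-based shard router: maps record keys onto a fixed set of shards."""

    def __init__(self, num_shards):
        self.num_shards = num_shards

    def shard_for(self, key):
        # Use a stable hash (not Python's per-process randomized hash()) so
        # the same key always lands on the same shard across restarts.
        digest = hashlib.md5(str(key).encode("utf-8")).hexdigest()
        return int(digest, 16) % self.num_shards

router = ShardRouter(num_shards=4)
shard_id = router.shard_for("user:12345")  # deterministic value in 0..3
# All reads and writes for "user:12345" would then be sent to shard `shard_id`.
```

Note that simple modulo routing reshuffles nearly every key when the shard count changes, which is why production systems tend to prefer consistent hashing or an explicit, configurable shard-key-to-shard mapping.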
Griffiths, G., Haines, K., Blower, J., Lewis, J. and Lin, N. (2014) The Environmental Data Abstraction Library (EDAL): a modular approach to processing and visualising large environmental data. In: 2014 Conference on Big Data from Space (BiDS'14), 12-14 November 2014, Frascati, pp. 97-100.
Robotics Professor Illah Nourbakhsh leads a discussion on Asia’s Industrialization using visualizations created by his CREATE Lab from Landsat imagery in 2015 at the World Economic Forum's Annual Meeting of the New Champions.
Carnegie Mellon University researchers and scientists will play an important role in global discussions at the World Economic Forum's Annual Meeting of the New Champions, June 27-29, in Dalian, China.
Often called "Summer Davos," to differentiate it from the forum's annual winter meeting in Switzerland, the meeting brings together world leaders in business, science, technology, innovation and politics. This year's theme is "Achieving Inclusive Growth in the Fourth Industrial Revolution."
CMU experts have since 2011 led conversations at the World Economic Forum in fields ranging from robotics to artificial intelligence. CMU scientists often lead discussions, give talks, demonstrate technology and provide their distinctive expertise.
Tom Mitchell, the E. Fredkin University Professor in the Machine Learning Department;
Illah Nourbakhsh, professor of robotics; and
Gabriel O'Donnell, principal research programmer and analyst in the Robotics Institute.
CMU will host a panel discussion called "The Future of Production with Carnegie Mellon University," in which Fuchs, Gannon and McCann will discuss rethinking behavior and purpose of industrial robots beyond factory floors, reimagining how large companies can integrate disruption themselves, and reconfiguring how automation collides with human skills.
Nourbakhsh and O'Donnell will make multiple presentations at the Global Situation Space exhibition. The presentations combine NASA time-lapse satellite imagery and geospatial and econometric data with predictive modelling to explore issues such as emerging megacities, man-made changes to the oceans and trade with China.
Nourbakhsh's CREATE Lab and its spinoff BirdBrain Technologies will be part of a workshop on building interactive sculptural robots. He will contribute to sessions on the fourth industrial revolution, the digital economy, the creative economy and platforms for artificial intelligence.
Mitchell will participate in a panel discussion about how the social safety net can respond to the fourth industrial revolution. He recently co-chaired a study of the future of work for the National Academies of Sciences, Engineering and Medicine. He will present a session on how big data can affect policymaking.
Madelyn Gannon works with industrial robots and is working to invent better ways to communicate with machines.
Gannon was one of 20 researchers selected to the World Economic Forum's Cultural Leaders advisory community. As part of the programming, she will be participating in sessions that discuss the impact of human-centered robotics on the future of work.
Fifty-two scientists under the age of 40 are recognized this year for exhibiting exceptional creativity, thought leadership and high growth potential, and will be at the Dalian conference.
CMU is one of only 27 universities in the world, 12 in the U.S., that make up the Global University Leaders Forum (GULF), which provides a unique platform for the world's top universities to discuss higher education and research while helping to shape the World Economic Forum agenda. GULF fosters discussion on global policy issues between member universities, the business community and a broad range of stakeholders.
Source: Tableau Software. 2016 was a milestone year for Big Data, with more organizations storing, processing, and extracting value from data of all formats and sizes. In 2017, systems supporting large volumes of structured and unstructured data will continue to grow. There will be market demand for platforms […]
Thousands of New Customers Improve Business and Gain Insight with Oracle Analytics Cloud
Oracle delivers industry's most comprehensive cloud analytics platform featuring new self-learning analytics
Redwood Shores, Calif.—Aug 10, 2017
Connecting people to the information they need through the power of cloud technology, Oracle today announced that Oracle Analytics Cloud is experiencing significant growth with thousands of organizations subscribing to the service globally. In addition to tripling adoption over the last 12 months, nearly 75 percent of customers are new to Oracle Analytics Cloud, ranging from small and medium-sized businesses accessing enterprise-class analytics for the first time to large organizations modernizing their analytics platforms. Arlington Orthopedic Associates, Outfront Media, and Skanska AB are among those using Oracle Analytics Cloud to identify new savings, help increase their return on investment, and fuel innovation.
In response to this rapid growth, Oracle released a new version of Oracle Analytics Cloud earlier this year, extending its breadth and depth with new capabilities such as user-driven scenario modeling, next-generation mobile and social analytics, and complete customer control over their cloud environment.
“Oracle Analytics Cloud makes it easy for customers to gain new insights and reap the rewards of digital transformation by offering the speed, scale, power, and flexibility organizations need in a single platform,” said Rich Clayton, vice president of analytics product strategy, Oracle. “Customers clearly understand the value, which is driving strong growth across the board – in our base, with new customers, and in utilization, which is the highest of any Oracle Platform as a Service offering.”
Oracle Delivers Most Comprehensive Cloud Analytics Platform
Oracle Analytics Cloud provides the industry’s most comprehensive cloud analytics in a single platform, including everything from self-service visualization and powerful inline data preparation to enterprise reporting, advanced analytics, and self-learning analytics that deliver proactive insights. With support for more than 50 data sources and an extensible, open framework, Oracle Analytics Cloud gives customers a complete, connected, collaborative platform that brings the power of data and analytics to every process, interaction, and decision.
John Cronin, Group CIO at An Post, explained that Ireland’s postal service chose Oracle Analytics Cloud “to extend and integrate into our existing big data analytics. This modern, agile, platform has enabled us to readily externalize our existing analytics and share insights with key customers.”
As part of this release, Oracle introduced an innovative new service, Oracle Analytics Cloud Day by Day. It is the first enterprise analytic application delivering proactive analytics to mobile devices based on business updates and personal preferences, ensuring the right information is always available, without customers even having to ask for it. Oracle Analytics Cloud Day by Day is complemented by a native mobile application, Oracle Analytics Cloud Synopsis, which enables anyone to visually analyze files on their mobile devices and then combine those insights with business information in Oracle Analytics Cloud Day by Day. The Oracle Analytics Cloud Synopsis app is available for free to all mobile users from the App Store for iPhone and iPad, and Google Play™ Store.
“One of our goals is to help our customers take advantage of cloud analytics,” said Francisco Tisiot, principal consultant at Rittman Mead. “Oracle Analytics Cloud provides complete and elastic business intelligence, and is customizable and manageable by customers, all in the Oracle Cloud.”
The Oracle Cloud offers complete SaaS application suites for ERP, HCM and CX, plus best-in-class database Platform as a Service (PaaS) and Infrastructure as a Service (IaaS) from data centers throughout the Americas, Europe and Asia. For more information about Oracle (NYSE:ORCL), please visit us at www.oracle.com.
Oracle and Java are registered trademarks of Oracle and/or its affiliates. Other names may be trademarks of their respective owners.
Oracle Significantly Expands Cloud at Customer with PaaS and SaaS Services to Help Customers in their Journey to the Cloud
Delivers unrivaled enterprise-grade public cloud SaaS, PaaS, and IaaS services in customers' datacenters
Redwood Shores, Calif.—Jul 19, 2017
Empowering organizations to move workloads to the cloud while keeping their data on their own premises, Oracle today announced significant expansion of the breadth of services available through Oracle Cloud at Customer. The portfolio now spans all of the major Oracle PaaS categories and for the first time, also features Oracle SaaS services. Since its introduction just over a year ago, Oracle Cloud at Customer has experienced unprecedented growth with leading global organizations across six continents and more than 30 countries adopting the solution, including AT&T and Bank of America.
Oracle Cloud at Customer is designed to enable organizations to remove one of the biggest obstacles to cloud adoption: data privacy concerns related to where the data is stored. While organizations are eager to move their enterprise workloads to the public cloud, many have been constrained by business, legislative and regulatory requirements that have prevented them from being able to adopt the technology. These first-of-a-kind services provide organizations with choice in where their data and applications reside and a natural path to eventually move business-critical applications to the public cloud.
"Oracle Cloud at Customer is a direct response to the remaining barriers to cloud adoption, turning those obstacles into opportunities by letting customers choose the location of their cloud services," said Thomas Kurian, president, product development, Oracle. "We are providing a unique service that enables our customers to leverage Oracle Cloud services, including SaaS, PaaS, and IaaS, both on their premises and in our cloud. Customers gain all the benefits of Oracle's robust cloud offerings, in their own datacenters, all managed and supported by Oracle."
Underpinning Oracle Cloud at Customer is a modern cloud infrastructure platform based on converged Oracle hardware, software-defined storage and networking and a first class IaaS abstraction. Oracle fully manages and maintains the infrastructure at customers' premises so that customers can focus on using the IaaS, PaaS and SaaS services. This is the same cloud infrastructure platform that powers the Oracle Cloud globally.
Based on overwhelming customer demand, Oracle continues to expand the services available via Oracle Cloud at Customer. With today's news, customers now have access to all of Oracle's major PaaS categories, including Database, Application Development, Analytics, Big Data, Application and Data Integration, and Identity Management. These services take advantage of specific enhancements that have been made to the underlying Oracle Cloud at Customer platform such as servers with faster CPUs and NVMe-based flash storage, as well as all-flash block storage to deliver even better performance for enterprise workloads.
For the first time, Oracle has also made available via Oracle Cloud at Customer, the ability to consume Oracle SaaS services such as Enterprise Resource Planning, Human Capital Management, Customer Relationship Management, and Supply Chain Management in their own datacenters. These best-in-class, modern applications help unlock business value and increase performance by enabling businesses and people to be more informed, connected, productive, and engaged. Major organizations are already adopting this new option to modernize their key enterprise operations and benefit from the speed of innovation in Oracle SaaS without having to move sensitive application data outside their premises. With the addition of SaaS services to Oracle Cloud at Customer, customers have access to Oracle Cloud services across the entire cloud stack, all delivered in a subscription-based, managed model, directly in their datacenters.
Also, newly available is the Oracle Big Data Cloud Machine, which is an optimized system delivering a production-grade Hadoop and Spark platform with the power of dedicated nodes and the flexibility and simplicity of a cloud offering. Organizations can now access a full range of Hadoop, Spark, and analytics tools on a simple subscription model in their own data centers.
Oracle Cloud at Customer delivers the following Oracle Cloud services:
Infrastructure: Provides elastic compute, containers, elastic block storage, object storage, virtual networking, and identity management to enable portability of Oracle and non-Oracle workloads into the cloud.
Data Management: Enables customers to use the number one database to manage data infrastructure in the cloud with the Oracle Database Cloud, including Oracle Database Exadata Cloud for extreme performance and Oracle MySQL Cloud.
Big Data and Analytics: Empowers an entire organization to use a single platform to take advantage of any data to drive insights. Includes a broad set of big data cloud services, including Oracle Big Data Cloud Service, Oracle Analytics Cloud, and Oracle Event Hub Cloud.
Application Development: Enables organizations to develop and deploy Java applications in the cloud using Oracle Java Cloud, Oracle Application Container Cloud, Oracle Container Cloud, and Oracle WebCenter Portal Cloud.
Enterprise Integration: Simplifies integration of on-premises applications to cloud applications, as well as cloud application to cloud application integration using Oracle Integration Cloud, Oracle SOA Cloud, Oracle Data Integrator Cloud, Oracle GoldenGate Cloud, Oracle Managed File Transfer Cloud, and Oracle Internet of Things Cloud.
Security: Enables organizations to use Oracle Identity Cloud to implement and manage consistent identity and access management policies.
Software-as-a-Service: Provides organizations with a complete suite of software to run their businesses, including Oracle ERP Cloud, Oracle CX Cloud, Oracle HCM Cloud, and Oracle Supply Chain Management Cloud.
Customer Demand Drives Expansion of Portfolio
Global organizations are turning to Oracle Cloud at Customer to standardize on a platform to modernize existing infrastructure and develop innovative new applications. Customers including City of Las Vegas, Federacion Colombiana de Municipios, Glintt Healthcare, HCPA, NEC, NTT DATA, Rakuten Card, State University of New York, and State Bank of India are benefitting from Oracle Cloud services from inside their own datacenters.
"The City of Las Vegas is shifting its Oracle application workloads to the Oracle Cloud," said Michael Sherwood, Director of Information Technologies, city of Las Vegas. "By keeping the data in our data center, we retain full control while enabling innovation, gaining efficiencies and building applications to better serve our community."
"Today, public organizations are constantly innovating to meet the needs of our citizens. For the Colombian Federation of Municipalities, we have decided to digitally transform our territories to become smart cities," said Alejandro Murillo, CIO of the Colombian Federation of Municipalities. "With Oracle Cloud at Customer, we have the technological capabilities to bring top-level solutions in the cloud to our municipalities, enabling them to operate with more agility and better serve our citizens."
"Oracle Cloud at Customer provides us with a consolidated solution to make sensitive healthcare data securely available," said Nuno Vasco Lopes, CEO, Glintt Healthcare Solutions. "The efficient and flexible solution has reduced the total cost of ownership by 18 percent and delivered high customer performance."
Oracle Cloud at Customer
The Oracle Cloud at Customer portfolio of services enables organizations to get all of the benefits of Oracle's public cloud services in their datacenters. The business model is just like a public cloud subscription; the hardware and software platform is the same; Oracle experts monitor and manage the infrastructure; and the same tools used in Oracle's public cloud are used to provision resources on the Oracle Cloud at Customer services. This is the only offering from a major public cloud vendor that delivers a stack that is 100 percent compatible with the public cloud but available on-premises, ensuring that customers get the same experience and the latest innovations and benefits using it in their datacenters as in the public cloud.
The Oracle Cloud offers complete SaaS application suites for ERP, HCM and CX, plus best-in-class database Platform as a Service (PaaS) and Infrastructure as a Service (IaaS) from data centers throughout the Americas, Europe and Asia. For more information about Oracle (NYSE:ORCL), please visit us at www.oracle.com.
Oracle and Java are registered trademarks of Oracle and/or its affiliates. Other names may be trademarks of their respective owners.
The preceding is intended to outline our general product direction. It is intended for information purposes only, and may not be incorporated into any contract. It is not a commitment to deliver any material, code, or functionality, and should not be relied upon in making purchasing decisions. The development, release, and timing of any features or functionality described for Oracle's products remains at the sole discretion of Oracle Corporation.
The Futurology team speaks to The Better Trading Company Managing Director, Stephen Wormald, about how they are changing rural farming through big data. They chat about how this impacts food stability and the future sustainability of farmers.
We recently returned from another great experience at the High Performance Computing Linux for Wall Street event in New York on April 7, 2014. This year's 11th annual HPC conference focused on big data, HPC applications, data center fabrics, cloud economics, low latency and how these technologies are all changing the way global financial markets are evolving. […]
Three years ago, CSC predicted that by 2020 data production will be 44 times greater than it was in 2009. Zettabytes (that's one billion terabytes) of information, residing online and in internal databases, have become both a huge opportunity and a terrifying information overload for many companies. Both private and public [...]
Initially big data seemed to be something only available to the biggest businesses. Analytics are now being built into almost every application, however, making the technology accessible to businesses of all sizes. As businesses realize the power of information to create successful marketing campaigns and see real-time results, data is [...]
Unless you live under a rock, you've seen the buzz about Data Lakes, Big Data, Data Mining, Cloud-tech, and Machine Learning. I watch and read reports from two perspectives: as an engineer and as a consultant.
As a Consultant
If you watch CNBC, you won't hear discussions about ETL Incremental Load or Slowly Changing Dimensions Design Patterns. You will hear them using words like "cloud" and "big data," though. That means people who watch and respect the people on CNBC are going to hire consultants who are knowledgeable about cloud technology and Big Data.
As an Engineer
I started working with computers in 1975. Since that time, I believe I've witnessed about one major paradigm shift per decade. I believe I am now witnessing two at the same time: 1) a revolution in Machine Learning and all the things it touches (which includes Big Data and Data Lakes); and 2) the Cloud. These two are combining in some very interesting ways. Data Lakes and Big Data appliances and systems are the sources for many systems; Machine Learning and Data Mining solutions are but a couple of their consumers. At the same time, much of this technology and storage is either migrating to the Cloud or is being built there (and in some cases, only there). But all of this awesome technology depends on something…
In order for Machine Learning or Data Mining to work, there has to be data in the Data Lake or in the Big Data appliance or system. Without data, the Data Lake is dry. Without data, there's no "Big" in Big Data. How do these solutions acquire data?
Some of these new systems have access to data locally. But many of them -- most, if I may be so bold -- require data to be rounded up from myriad sources. Hence my claim that data integration is the foundation for these new solutions.
What is Data Integration and Why is it Important?
Data integration is the collection of data from myriad, disparate sources into a single repository (or a minimal number of repositories). It's "shipping" the data from where it is to someplace "nearer." Why is this important? Internet connection speeds are awesome these days. I have, literally, 20,000 times more bandwidth than when I first connected to the internet. But modern internet connection speeds are hundreds to millions of times slower than networks running inside data centers. Computing power, measured in cycles or flops per second, is certainly required to perform today's magic with Machine Learning. But if the servers must wait hours (or longer) for data, instead of milliseconds? The magic happens in slow-motion. In slow-motion, magic doesn't look awesome at all.
Trust me, speed matters.
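As a sketch of what that "shipping" looks like in practice, here is a minimal incremental load in Python. The sources, table layout, and watermark values are all illustrative, not taken from any particular product:

```python
import sqlite3

# Two "remote" sources and one local repository, all illustrative.
# In practice the sources would be separate systems reached over the
# network; the point is to land their rows in one nearby store so
# downstream consumers read at data-center speed, not internet speed.

def load_source(repo, source_name, rows, watermark):
    """Copy only rows newer than the last watermark (incremental load)."""
    new_rows = [r for r in rows if r["updated"] > watermark]
    repo.executemany(
        "INSERT INTO staging (source, id, updated) VALUES (?, ?, ?)",
        [(source_name, r["id"], r["updated"]) for r in new_rows],
    )
    # Return the new high-water mark for the next run.
    return max((r["updated"] for r in new_rows), default=watermark)

repo = sqlite3.connect(":memory:")
repo.execute("CREATE TABLE staging (source TEXT, id INTEGER, updated INTEGER)")

crm_rows = [{"id": 1, "updated": 10}, {"id": 2, "updated": 25}]
erp_rows = [{"id": 7, "updated": 5}, {"id": 8, "updated": 30}]

wm_crm = load_source(repo, "crm", crm_rows, watermark=0)   # both rows are new
wm_erp = load_source(repo, "erp", erp_rows, watermark=20)  # only id 8 is new

count = repo.execute("SELECT COUNT(*) FROM staging").fetchone()[0]
print(count)  # 3
```

The watermark is what makes the load incremental: each run ships only what changed since the last run, rather than re-copying everything.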
Data integration is the foundation on which most of these systems depend. Some important questions to consider:
Are you getting the most out of your enterprise data integration?
Could your enterprise benefit from faster access to data -- perhaps even near real-time business intelligence?
How can you improve your enterprise data integration solutions?
The next BriefingsDirect global digital business panel discussion explores how the expansion of automated tactical buying for business commerce is impacting global markets, and what's in store next for Latin America.
We'll specifically examine how "spot buy" approaches enable companies to make time-sensitive and often mission-critical purchases, even in complex and dynamic settings, like Latin America.
Alvarez: The concept is a few years old, but we've been delivering SAP Ariba Spot Buy for about a year. We began in the US, and over the past 12 months the concept of Spot Buy has progressed because of our customer base. Our customer base has pushed us in a direction that is, quite frankly, even beyond Spot Buy -- and it's getting into trusted, vetted content.
We are approaching the market with a two-pronged strategy of, yes, we have the breadth of content so that when somebody goes into an SAP Ariba application they can find what they are looking for, but we also now have parameters and controls that allow them to vet that content and to put a filter on it.
Over the last 12 months, we've come a long way. We are live in the US, and with early access in the UK and Germany. We just went live in Australia, and now we are very much looking forward to going live and moving fast into Latin America with MercadoLibre.
Gardner: Spot buying, or tactical buying, is different from strategic or more organized long-term buying. Tell us about this subset of procurement.
Alvarez: SAP Ariba is a 20-year-old company, and its roots are in that rigorous, sourced approach. We do hundreds of billions of dollars through contract catalog on the Ariba Network, but there's a segment -- and we believe it's upward of 15% of spend -- that is spot buy spend. The procurement professional often has no idea what's being bought. And I think there are two approaches to that -- either ignorance is bliss and they are glad that it's out of their purview, or it also keeps them up at night.
SAP Ariba Spot Buy allows them to have visibility into that spend. By partnering with providers like MercadoLibre, they have content from trusted and vetted sellers to bring to the table -- so it's a really nice match for procurement.
Gardner: The trick is to allow for flexibility and being dynamic, but also putting in enough rules and policies so that things don't go off-track.
Alvarez: Exactly. For example, it's like putting a filter on your kids' smartphone. You want them to be liberated so they can go and do as they please with phone calls -- but not go off the guardrails.
Gardner: Karen, tell us about MercadoLibre and why Latin America might be a really interesting market for this type of Spot Buy service.
Bruck: MercadoLibre is a leading e-commerce platform in Latin America, where we provide the largest marketplaces in 16 different countries. Our main markets are Brazil, Mexico, and Argentina, and that's where we are going to start this partnership with SAP Ariba.
We have upward of 60 million items listed on our platform, and this breadth of supplies will make purchasing very exciting. Latin America is a complicated market -- and we like this complexity. We do very well.
It's complicated because there are different rates of inflation in different countries, and so contracts can be hard to complete. What we bring to the table is an assortment of great payment and shipping solutions that make it easy for companies to purchase items. As Tony was saying, these are not under long-term contracts, but we still get to make use of this vast supply.
Gardner: Tony mentioned that maybe 15% of spend is in this category. Diego, do you think that that number might be higher in some of the markets that you serve?
Cabrera Canay: That's probably the number -- but that is a big number in terms of the spend within companies. So we have to get there and see what happens.
Gardner: Tony, tell us about the partnership. What is MercadoLibre.com bringing to the table? What is Ariba bringing to the table? How does this fit together for a whole that is greater than the sum of its parts?
Alvarez: It really is a well-matched partnership. SAP Ariba is the leading cloud procurement platform, period. When you look in Latin America, our penetration with SAP Enterprise Resource Planning (ERP) is even greater. We have a very strong installed base with SAP ERP.
Our plan is to take the SAP Ariba Spot Buy content and make it available to the SAP installed base. So this goes way beyond just SAP Ariba. And when you think about what Karen mentioned -- difficulties in Latin America with high inflation -- the catalog approach is not used as much in Latin America because everything is so dynamic.
For example, you might sign a contract, but in just a couple of weeks that contract may be obsolete, or unfavorable because of a change in pricing. But once we build controls and parameters in SAP Ariba Spot Buy, you can layer that on top of MercadoLibre content, which is super-broad. If you're looking for it, you're going to find it, and that content is constantly updated. You gain real-time access to the latest information, and then the procurement person gets the benefit of control.
So I'm very optimistic. As Diego mentioned, I think 15% is really on the low-end in Latin America for this type of spend. I think this will be a really nice way to put digital catalog buying in the hands of large enterprise buyers.
Gardner: Speaking of large enterprise buyers, if I'm a purchasing official in one of your new markets, what should I be thinking about how this is going to benefit me?
Transparent, trusted transactions
Bruck: Let me talk about this from experience. As a country manager at MercadoLibre, I had to do a lot of the procurement, together with our procurement officers. It was really frustrating at times because all of these purchases had to be one-off engagements, with a different vendor every time. That takes a lot of time. You also have to bring in price comparisons, and that's not always a simple process.
So what this platform gives you is the ability to be very transparent about prices among different suppliers. That makes it very easy to buy every time without having to call and get the vendor onto your own buying platform.
It saves a lot of time, it makes the comparison very transparent, and you are able to control the different options. Overall, it's a win-win. So I do believe this is a partnership, a match made in heaven.
We were also very interested in business-to-business (B2B) industries. When Tony and SAP Ariba came to our offices to offer this partnership, we thought this would be a great way to leverage their needs with our supply and make it work.
Gardner: For sellers, this enables them to do repeated business more easily, more automated and so at scale. For buyers, with transparency they have more insight into getting the best prices, the best terms of delivery. Let's expand on that win-win. Diego, tell us about the business benefits for all parties.
Big and small, meet at the mall
Cabrera Canay: In the past few years, we have been working to make MercadoLibre the biggest "mall" in e-commerce. We have the most important brands and the most important retailers selling through MercadoLibre.
What differentiates us is that we are confident we have the best prices -- and also other great services such as free shipping, easy payments, and financing. We are sure that we can offer the buyers better purchasing.
Obviously, from the side of sellers, this all provides higher demand, it raises the bar in terms of having qualified buyers, and then giving the best services. That's very exciting for us.
Gardner: Tony, we mentioned large enterprises, but this cuts across a great deal more of the economy, such as small- to medium-sized (SMB) businesses. Tell us how this works across diverse economies where there are large players but lots of small ones, too?
Alvarez: On the sales side, this gives really small businesses the opportunity to reach large enterprise buyers that probably weren't there before.
Diego was being modest, but MercadoLibre's payment structure, MercadoPago, is incredibly robust, and it's incredibly valuable to that end-seller, and also to the buyer.
Just having that platform and then connecting -- you are basically taking two populations, the large and small sellers, and the large and small buyers, and allowing them to commingle more than they ever had in the past.
Gardner: Karen, as you mentioned from your own experience, when you're dealing with paper, and you are dealing with one-offs, it's hard to just keep track of the process, never mind to analyze it. But when we go digital, when we have a platform, when we have business networks at work, then we can start to analyze things for companies -- and more broadly into markets.
How do you see this partnership accelerating the ability to leverage analytics, leverage some of the back-end platform technologies with SAP HANA and SAP Ariba, and making more strides toward productivity for your customers?
Bruck: Right. When everything is tracked, as this will be, because every single purchase will be inside their SAP Ariba platform, it is all part of your "big data." So then you can actually drop it, control it, analyze it, and say, "Hey, maybe these particular purchases mean that we should have long-term contracts, or that our long-term contracts were not priced correctly," and maybe that's an opportunity to save money and lower costs.
So once you can track data, you can do a lot of things, and discover new opportunities for either being more efficient or reducing costs -- and that's ultimately what we all want in all the departments of our companies.
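The kind of spend analysis Bruck describes can be sketched in a few lines of Python. All purchase records, item names, and the repeat threshold below are hypothetical, purely to show the idea of mining tracked spot buys for contract candidates:

```python
from collections import defaultdict

# Illustrative one-off (spot-buy) purchase records captured by the
# procurement platform; in practice these would come from the system
# of record rather than a hard-coded list.
purchases = [
    {"item": "laptop", "price": 1200},
    {"item": "laptop", "price": 1350},
    {"item": "laptop", "price": 1100},
    {"item": "office chair", "price": 300},
]

# Group spot buys by item: items bought repeatedly are candidates
# for a negotiated long-term contract.
by_item = defaultdict(list)
for p in purchases:
    by_item[p["item"]].append(p["price"])

REPEAT_THRESHOLD = 3  # arbitrary cutoff for "bought often enough"
candidates = {
    item: sum(prices) / len(prices)  # average one-off price paid
    for item, prices in by_item.items()
    if len(prices) >= REPEAT_THRESHOLD
}
print(candidates)
```

Here only "laptop" clears the threshold, and its average one-off price becomes the benchmark a negotiated contract would need to beat.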
Gardner: And for those listeners and readers who are interested in taking advantage of these services, and ultimately that great ability to analyze, what should they be doing now to get ready? Are there some things they could do culturally, organizationally, in order to become that more digital business when these services are available to them?
Cabrera Canay: I can talk about our own case, where we are rebuilding our purchase processes. Paper is terrible for companies; you have to rethink your purchase processing in a digital way. Once you do, SAP Ariba is a great solution, and with SAP Ariba Spot Buy we will have the best conditions for the buyers.
Bruck: It's a natural process. People are going digital and embracing these new trends and technologies. It will make them more efficient. If they get up to speed quickly, it will become less about controlling stuff that they don't need to control. They will really understand the benefits, so it will be a natural adoption.
Gardner: Tony, coming back full circle, as you have rolled SAP Ariba Spot Buy out from North America to Europe to Asia-Pacific, and now to Latin America -- what have you learned in the way people use it?
Alvarez: First, at a macro level, people have found this to be a useful tool to replace some of the contracts that were less important, and so they can rely on marketplaces.
Second, we've really found as we've deployed in the US that a lot of times multinational companies are like, "Hey, that's great, I love this, but I really want to use this in Latin America." So they want to go and get visibility elsewhere.
Third, they want a tool that doesn't require any training. If I'm a procurement professional, I want my users to already be expert at using the tool. We've designed this in the process context, and in concert with the content partners. You can just walk up and start using it. You don't have to be an expert, and it keeps you within the guardrails without even thinking about it.
Gardner: And being a cloud-based, software-as-a-service (SaaS) solution you're always analyzing how it's being used -- going after that ultimate optimized user experience -- and then building those improvements back in on a constant basis?
The next BriefingsDirect digital business thought leadership panel discussion explores new ways that companies can gain improved visibility, analytics, and predictive responses to better manage supply chain risk in the digital economy.
The panel examines how companies such as Nielsen are using cognitive computing search engines, and even machine learning and artificial intelligence (AI), to reduce risk in their overall buying and acquisitions.
Gardner: Padmini, we heard at SAP Ariba LIVE that risk is opportunity. That stuck with me. Are the technologies really now sufficient that we can fully examine risks to such a degree that we can turn that into a significant business competitive advantage? That is to say, those who take on risk seriously, can they really have a big jump over their competitors?
Ranganathan: I come from Silicon Valley, so we have to take risks for startups to grow into big businesses, and we have seen a lot of successful entrepreneurs do that. Clearly, taking risks drives bigger opportunity.
But in this world of supplier and supply chain risk management, it's even more important and imperative that the buyer and supplier relationships are risk-aware and risk-free. The more transparent that relationship becomes, the more opportunity for driving more business between those relationships.
That context of growing business -- as well as growing the trust and the transparent relationships -- in a supply chain is better managed by understanding the supplier base. Understanding the risks in the supplier base, and then converting them into opportunities, allows mitigating and solving problems jointly. By collaborating together, they form partnerships.
Gardner: Dan, it seems that what was once acceptable risk can now be significantly reduced. How do people in procurement and supply chain management know what acceptable risk is -- or maybe they shouldnât accept any risk?
Adamson: My roots are also in Silicon Valley, and I think you are absolutely right that at times you should be taking risks -- but not unnecessarily. What the procurement side has struggled with -- and this comes from my jumping from financial institutions, where they treat risk very differently, through to procurement -- is risk versus the price point of avoiding that risk. That's traditionally been the big problem.
For every vendor that you on-board, you have to pay $1,000 for a due-diligence report, and it's really not price-effective. But being able to maintain and monitor that vendor on a regular basis at acceptable cost -- then there's a real risk-versus-reward benefit in there.
What we are helping to drive are a new set of technology solutions that enable a deeper level of due diligence through technology, through cognitive computing, that wasn't previously possible at the price point that makes it cost-effective. Now it is possible to clamp down and avoid risk where necessary.
Gardner: James, as a consumer of some of these technologies, do you really feel that there has been a significant change in that value equation -- that for less money you are getting a lot less risk?
Knowing what you're up against
Johnson: To some degree that value was always there; it was just difficult to help people see that value. Obviously tools like this will help us see that value more readily.
It used to be that in order to show the value, you actually had to do a lot of work, and it was challenging. What we are talking about here is that we can begin to boil the ocean. You can test these products, and you can do a lot of work just looking at test results.
And, it's a lot easier to see the value because you will unearth things that you couldn't have seen in the past.
Whereas it used to take a full-blown implementation to begin to grasp those risks, you can now just test your data and see what you find. Most people, once they have their eyes wide open, will be at least a little more fearful. But, at the same time -- and this goes back to the opportunity question you asked -- they will see the opportunity to actually tackle these risks. It's not like those risks didn't exist in the past, but now they know they are there -- and they can decide to do something about it, or not.
Gardner: So rather than avoid the entire process, now you can go at the process but with more granular tools to assess your risks and then manage them properly?
Johnson: That's right. I wouldn't say that we should have a risk-free environment; that would cost more money than we're willing to pay. That said, we should be more conscious of what we're not yet willing to pay for.
Rather than just leaving the risk out there and avoiding business where you can't access information about what you don't know -- now you'll know something. It's your choice to decide whether or not you want to go down the route of eliminating that risk, of living with that risk, or maybe something in between. That's where the sweet spot is. There are probably a lot of intermediate actions that people would be taking now that are very cheap, but they haven't even thought to do so, because they haven't assessed where the risk is.
Gardner: Padmini, because we're looking at a complex landscape -- a supply chain, a global supply chain, with many tiers -- when we have a risk solution, it seems that it's a team sport. It requires an ecosystem approach. What has SAP Ariba done, and what is the news at SAP Ariba LIVE? Why is it important to be a team player when it comes to a fuller risk reduction opportunity?
Ranganathan: You said it right. The risk domain world is large, and it is specialized. The language that the compliance people use in the risk world is somewhat similar to the language that the lawyers use, but very different from the language that the information technology (IT) security and information security risk teams use.
The reason you can't see many of the risks is partly because the data, the information, and the fragmentation have been too broad, too wide. It's also because the types of risks, and the people who deal with these risks, are scattered across the organization.
So a platform that supports bringing all of this together is number one. Second, the platform must support the end-to-end process of managing those supply chain relationships, managing the full supply chain, and gaining transparency across it. That's where SAP Ariba is headed with Direct Materials Sourcing and with getting more into supply chain collaboration. That's what you heard at SAP Ariba LIVE.
We all understand that supply chain much better when we are in SAP Ariba, and then you have this ecosystem of partners and providers. You have the technology with SAP and HANA to gain the ability to mash up big data, set it in context, and understand the patterns. We also have the open ecosystem and the open source platform to allow us to take that even wider. And last but not least, there is the business network.
So it's not just between one company and another company; it's a network of companies operating together. The momentum of that collaboration allows users to say, "Okay, I am going to push for finding ethical companies to do business with" -- and that's really where the power of the network multiplies.
Gardner: Dan, when a company nowadays buys something in a global supply chain, they are not just buying a product -- they are buying everything that's gone on with that product, such as the legacy of that product, from cradle to PO. What is it that OutsideIQ brings to the table that helps them get a better handle on what that legacy really is?
Dig deep, reduce risk, save time
Adamson: Yes, and they are not just buying from that seller, they are buying from the seller that sold it to that seller, and so they are buying a lot of history there -- and there is a lot of potential risk behind the scenes.
That's why this previously has been a manual process, because there has been a lot of contextual work in pulling out those needles from the haystack. It required a human level of digging into context to get to those needles.
The exciting thing that we bring is a cognitive computing platform that's trainable -- and it's been trained by FinCrime's experts and corporate compliance experts. Increasingly, supply management experts help us know what to look for. The platform has the capability to learn about its subject, so it can go deeper. It can actually pivot on where it's searching. If it finds a presence in Afghanistan, for example, well then that's a potential risk in itself, but it can then go dig deeper on that.
And that level of deeper digging is something that a human really had to do before. This is the exciting revolution that's occurring. Now we can bring back that data, it can be unstructured, it can be structured, yet we can piece it together and provide some structure that is then returned to SAP Ariba.
The great thing about the supply management risk platform or toolkit that's being launched at SAP Ariba LIVE is that there's another level of context on top of that. Ariba understands the relationship between the supplier and the buyer, and that's an important context to apply as well.
How you determine risk scores on top of all of that is very critical. You need to weed out all of the noise, otherwise it would be a huge data science exercise and everyone would be spinning his or her wheels.
This is now a huge opportunity for clients like James to truly get some low-hanging fruit value, where previously it would have been literally a witch-hunt or a huge mining expedition. We are now able to achieve this higher level of value.
Gardner: James, Dan just described what others are calling investigative cognitive computing brought to bear on this supply chain risk problem. As someone who is in the business of trying to get the best tools for their organization, where do you come down on this? How important is this to you?
Johnson: It's very important. I have done the kinds of investigations that he is talking about. For example, if I am looking at a vendor in a high-risk country -- particularly a small vendor that doesn't have an international presence -- that is problematic for most supplier investigations. What do I do? I will go and do some of the investigation that Dan is talking about.
Now I'm usually sitting at my desk in Chicago. I'm not going out in the world. So there is a heightened level of due-diligence that I suspect neither of us are really talking about here. With that limitation, you want to look up not only the people, you want to look up all their connections. You might have had a due-diligence form completed, but that's an interested party giving you information, what do you do with it?
Well, I can run the risk search on more than just the entity that I'm transacting with. I am going to run it on everyone that Dan mentioned. Then I am going to look up all their LinkedIn profiles, see who they are connected to. Do any of those people show any red flags? I'd look at the bank that they use. Are there any red flags with their bank?
I can do all that work, and I can spend several hours doing all that work. As a lawyer I might dig a little deeper than someone else, but in the end, it's human labor going into the effort.
Gardner: And that really doesn't scale very well.
Johnson: That does not scale at all. I am not going to hire a team of lawyers for every supplier. The reality here is that now I can do some level of that time-consuming work with every supplier by using the kind of technology that Dan is talking about.
The promise of OutsideIQ technology is incredible. It is an early and quickly expanding opportunity. It's because of relationships like the one between SAP Ariba and OutsideIQ that I see a huge opportunity between Nielsen and SAP Ariba. We are both on the same roadmap.
Nielsen has a lot of work to do, SAP Ariba has a lot of work to do, and that work will never end, and that's okay. We just need to be comfortable with it, and work together to build a better world.
Gardner: Tell us about Nielsen. Then secondarily, what part of your procurement, your supply chain, do you think this will impact best first?
Automatic, systematic risk management
Johnson: Nielsen is a market research company. We answer two questions: What do people watch? And what do people buy? It sounds very simple, but when you cover 90% of the world's population, which we do -- more than six billion people -- you can imagine that it gets a little bit more complicated.
We house about 54 petabytes of database data. So the scale there is huge. We have 43,000 employees. It's not a small company. You might know Nielsen for the set-top boxes in the US that tell what the ratings were overnight for the Super Bowl, for example, but it's a lot more than that. And you can imagine, especially when you're trying to answer what people buy in developing countries with emerging economies, that you are touching some riskier things.
In terms of what this SAP Ariba collaboration can solve for us, the first quick hit is that we will no longer have to leverage multiple separate sources of information. I can now leverage all the sources of information at one time through one interface. It is already being used to deliver information to people who are involved in the procurement chain. That's the huge quick win.
The secondary win is from the efficiency that we get in doing that first layer of risk management. Now we can start to address that middle tier that I mentioned. We can respond to certain kinds of risk that, today, we are doing ad-hoc, but not systematically. There is that systematic change that will allow us to not only target the 100 to 200 vendors that we might prioritize -- but the thousands of vendors that are somewhere in our system, too.
That's going to revolutionize things, especially once you fold in the environmental, social and governance (ESG) work that, today, is very focused for us. If I can spread that out to the whole supply chain, that's revolutionary. There are a lot of low-cost things that you can do if you just have the information.
So it's not always a question of, "Am I going to do good in the world, and how much is it going to cost me?" It's really a question of, "What is the good in the world that's freely available to me, that I'm not even touching?" That's amazing! And that's the kind of thing that you can go to work for, and be happy about your work, and not just do what you need to do to get a paycheck.
Gardner: It's not just avoiding the bad things; it's the false positives that you want to remove so that you can get the full benefit of a diverse, rich supplier network to choose from.
Johnson: Right, and today we are essentially wasting a lot of time on suspected positives that turn out to be false. We waste time on them because we go deeper with a human than we need to. Let's let the machines go as deep as they can, and then let the humans come in to take over where we make a difference.
Gardner: Padmini, it's interesting to me that he is now talking about making this methodological approach standardized, part of due-diligence that's not ad-hoc; it's not exception management. As companies make this a standard part of their supply chain evaluations, how can we make this even richer and easier to use?
Ranganathan: The first step was the data. It's the plumbing; we have to get that right. It's about the way you look at your master data, which is suppliers; the way you look at what you are buying, which is categories of spend; and where you are buying from, which is all the regions. So you already have the metrics segmentation of that master data, and everything else that you can do with SAP Ariba.
The next step is then the process, because it's really not one-size-fits-all. It cannot be one-size-fits-all, where for every supplier that you on-board you ask the same set of questions, check the box, and move on.
I am going to use the print-service vendor example again, which is my favorite. For printing marketing materials, you have a certain level of risk, and that's all you need to look at. But you still want, of course, to look at them for any adverse media incidents, or whether they suddenly got on a watch-list for something; you do want to know that.
But when one of your business units begins to use them for customer-confidential data and statement printing -- the level of risk shoots up. So the intensity of risk assessments and the risk audits and things that you would do with that vendor for that level of risk then has to be engineered and geared to that type of risk.
So it cannot be one-size-fits-all; it has to go past the standard. The standardization is not in the process; the standardization is in the way you look at risk, so that you can determine how much of the process you need to apply -- and stay in tune.
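The idea of tuning assessment intensity to the risk a vendor actually carries can be sketched as a simple tiering function. This is an illustrative sketch only -- the tier names, rules, spend categories, and assessment lists below are invented, not SAP Ariba's actual risk model:

```python
# Illustrative sketch: scale risk-assessment intensity to a vendor's usage
# context instead of applying one checklist to every supplier.
# Tier names, rules, and categories are hypothetical, not SAP Ariba's model.

def risk_tier(handles_confidential_data, high_risk_region, spend_category):
    """Return an assessment tier for a vendor relationship."""
    if handles_confidential_data:
        return "high"    # e.g. statement printing with customer data
    if high_risk_region or spend_category in {"it_services", "logistics"}:
        return "medium"
    return "low"         # e.g. printing marketing materials

ASSESSMENTS = {
    "low":    ["adverse-media monitoring", "watch-list screening"],
    "medium": ["adverse-media monitoring", "watch-list screening",
               "annual questionnaire"],
    "high":   ["adverse-media monitoring", "watch-list screening",
               "annual questionnaire", "on-site audit",
               "contract clause review"],
}

# Same print vendor, two contexts -> two very different assessment plans.
print(ASSESSMENTS[risk_tier(False, False, "print_services")])
print(ASSESSMENTS[risk_tier(True, False, "print_services")])
```

The point of the sketch is Ranganathan's: the process varies per relationship, but the way risk is classified stays standard.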
Gardner: Dan, clearly SAP Ariba and Nielsen want the "dials" -- they want to be able to tune this in. What's coming next? What should we expect in terms of what you, and other partners like you, can bring to the table in delivering the rich, customizable inference and understanding benefits that these organizations want?
Constructing cognitive computing by layer
Adamson: We are definitely in early days on the one hand. But on the other hand, we have historically seen many AI failures, where we failed to commercialize AI technologies. This time it's a little different, because of the big data movement, and because of the well-known use cases in machine learning that have been very successful -- pattern matching, recommending, and classifying. We are using that as a backbone to build layers of cognitive computing on top.
And I think as Padmini said, we are providing a first layer, and it's getting stronger and stronger. We can weed out up to 95% of the false positives to start with, and really let the humans look at the thorny or potentially thorny issues that are left over. That's a huge return on investment (ROI) and a timesaver by itself.
But on top of that, you can add in another layer of cognitive computing, and that might be at the workflow layer that recognizes the data and says, "Jeez, just a second here, there's a potential confidentiality issue here; let's treat this vendor differently, and let's go as far as plugging a special clause into the contract." This is, I think, where SAP Ariba is going with that. It's building a layer of cognitive computing on top of another layer of cognitive computing.
Actually, human processes work like that, too. There is a lot of fundamental pattern recognition at the basis of our cognitive thought, and on top of that we layer logic. So it's a fun time to be in this field, executing one layer at a time, and it's an exciting approach.
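The first layer Adamson describes -- machines clearing the bulk of hits so humans review only the residue -- can be sketched as a simple two-stage filter. The vendor names, scores, and threshold below are invented for illustration; a real system would compute the scores from a trained risk model:

```python
# Two-stage screening sketch: an automated scoring pass clears most hits,
# and only items above a risk threshold are escalated to a human reviewer.
# Scores would come from a real risk model; here they are hard-coded.

def machine_screen(hits, threshold=0.8):
    """Split (name, risk_score) pairs into auto-cleared and escalated lists."""
    cleared, escalated = [], []
    for name, score in hits:
        (escalated if score >= threshold else cleared).append(name)
    return cleared, escalated

hits = [("Acme Print Co", 0.10), ("Globex Logistics", 0.95),
        ("Initech Supplies", 0.30), ("Umbrella Imports", 0.85),
        ("Stark Paper", 0.05)]

cleared, escalated = machine_screen(hits)
print(f"auto-cleared {len(cleared)} of {len(hits)}; escalated: {escalated}")
```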
Stay with us now as we develop a new vision for how today's cutting-edge technologies will usher in tomorrow's most powerful business tools and processes. The panel was assembled and recorded at the recent 2017 SAP Ariba LIVE conference in Las Vegas. The discussion is moderated by Dana Gardner, principal analyst at Interarbor Solutions.
The panel includes Sanjay Almeida, Senior Vice President and Chief Product Officer of Network Solutions.
Here are some excerpts:
Gardner: It seems like only yesterday we were content to have a single view of a customer, or clean data, or maybe a single end-to-end business process value. But now, we are poised to leapfrog the status quo by using words like predictive and proactive for many business functions.
Why are AI and ML such disrupters to how we've been doing business processes?
Shahane: If you look back, some of the technology impacting our private lives is now impacting our public lives. Think about the amount of data and signals that we are gathering; we call it big data.
We not only do transactions in our personal lives, we also have a lot of content pushed at us. Our phones record our location as we move, so we are wired and we are hyper-connected.
Similar things are happening to businesses. Since we are so connected, a lot of data is created. Having all that big data -- and it could be a problem from the privacy perspective -- gives you an opportunity to harness that data, to optimize it, and to make your processes much more efficient and much more engaging.
If you think about dealing with big data, you try and find patterns in that data, instead of looking at just the raw data. Finding those patterns collectively as a discipline is called machine learning. There are various techniques, and you can find a regression pattern, or you can find a recommendation pattern -- you can find all kinds of patterns that will optimize things, and make your experience a lot more engaging.
If you combine all these machine learning techniques with tools such as natural language processing (NLP), higher-level tools such as inference engines, and text-to-speech processing -- you get things like Siri and Alexa. These were created for the consumer space, but the same thing can be made available for your business, and you can train it for your business processes. Overall, these improve efficiency, delight users, and provide a very engaging user experience.
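As a toy illustration of the pattern-finding idea Shahane describes -- replacing raw data with a regression pattern -- a one-variable least-squares fit condenses a series of data points into just two numbers. The monthly spend figures here are invented:

```python
# Toy pattern-finding example: fit a line to raw monthly spend so the trend
# (slope and intercept) can stand in for the individual data points.

def fit_line(xs, ys):
    """Ordinary least-squares fit; returns (slope, intercept)."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    return slope, mean_y - slope * mean_x

months = [1, 2, 3, 4, 5]
spend = [10.0, 12.5, 13.8, 16.1, 18.0]  # hypothetical monthly spend
slope, intercept = fit_line(months, spend)
print(f"spend grows about {slope:.2f} per month")
```

Recommendation and classification patterns work the same way in spirit: the raw records are compressed into a model you can act on.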
Gardner: Sanjay, from the network perspective it seems like we are able to take advantage of really advanced cloud services, and put that into a user experience that can be conversational, like we do with our personal consumer devices.
What is it about the cloud services in the network, however, that are game-changers when it comes to applying AI and ML to just good old business processes?
Almeida: Building on Dinesh's comment, we have a lot of intelligent devices in our homes. When we watch Netflix, there are a lot of recommendations that happen. We control devices through voice. When we get home the lights are on. There is a lot of intelligence built into our personal lives. And when we go to work, especially in an enterprise, the experience is far different. How do we make sure that your experience at home carries forward to when you are at work?
From the enterprise and business networks perspective, we have a lot of data; a lot of business data about the purchases, the behaviors, the commodities. We can use that data to make the business processes a lot more efficient, using some of the models that Dinesh talked about.
How do we actually do a recommendation so that we move away from traditional search, and take action on rows and columns, and drive that through a voice interface? How do we bring that intelligence together, and recommend the next actions or the next business process? How do we use the data that we have and make it a more recommended-based interaction versus the traditional forms-based interaction?
Gardner: Sudhir, when we go out to the marketplace with these technologies, and people begin to use them for making better decisions, what will that bring to procurement and supply chain activities? Are we really talking about letting the machines make the decisions? Where does the best of what machines do and the best of what people do meet?
Bhojwani: Quite often I get this question: What will be the role of procurement in 2025? Are machines going to make all the decisions, leaving us no role to play? You can say the same thing about all aspects of life, so why only procurement?
I think human intelligence is still here to stay. I believe, personally, it can be augmented. Let's take a concrete example to see what it means. At SAP Ariba, we are working on a product called product sourcing. Essentially this product takes a bill of material (BOM), and it tells you the impact. So what is so cool about it?
One of our customers has a BOM, which is an eight-level deep tree with 10 million nodes in it. In this 10 million-node commodity tree, or BOM, a person is responsible for managing all the items. But how does he or she know what is the impact of a delay on the entire tree? How do you visualize that?
I think humans are very poor at visualizing a 10-million node tree; machines are really good at it. Well, where the human is still going to be required is that eventually you have to make a decision. Are we comfortable that the machine alone makes a decision? Only time will tell. I continue to think that this kind of augmented intelligence is what we are looking for, not some machine making complete decisions on our behalf.
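The BOM-impact question Bhojwani poses -- which assemblies are affected when one item slips -- reduces to walking ancestors in a tree, which is exactly the kind of bookkeeping machines do well. This is a toy sketch with invented part names, not SAP Ariba's product sourcing implementation:

```python
# Toy BOM sketch: parent -> children tree; a delay on one item impacts every
# assembly above it. A real BOM could be eight levels deep with millions of
# nodes, but the traversal logic is the same.

bom = {
    "bike": ["frame", "wheel_assembly"],
    "wheel_assembly": ["rim", "hub", "spokes"],
    "frame": ["tubing", "paint"],
}

# Invert to child -> parent so we can walk upward from the delayed item.
parent = {child: p for p, children in bom.items() for child in children}

def impacted_by_delay(item):
    """Return the set of assemblies affected by a delay on `item`."""
    affected = set()
    while item in parent:
        item = parent[item]
        affected.add(item)
    return affected

print(impacted_by_delay("hub"))  # the wheel assembly and the finished bike
```

The machine computes the impact set; the human still decides what to do about it -- the augmented-intelligence split described above.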
Gardner: Dinesh, in order to make this more than what we get in our personal consumer space, which in some cases is nice to have, it doesn't really change the game. But we are looking for higher productivity in business. The C-Suite is looking for increased margins; they are looking for big efficiencies. What is it from a business point of view that these technologies can bring? Is this going to be just lipstick on a pig, so to speak, or do we really get to change how business productivity comes about?
Humans and machines working together
Shahane: I truly believe it will change productivity. The whole intelligence advantage -- if you look at it from the highest perspective, such as an enhanced user experience -- provides an ability to help you make your decisions.
When you make decisions with this augmented assistant helping you along the way -- and at the same time dealing with large amounts of data combined for a business benefit -- I think it will make a huge impact.
Let me give you an example. Think about supplier risk. Today, you look at risk first in terms of the people on the network with whom you are directly doing business. You want to know everything about them and their profile, and you care about them being a good business partner to you.
But think about the second, third, and fourth tiers, where some things become not so visible to your business. All that information for those further tiers is not directly available on the network; that is distant. But if those signals can be captured and somehow surfaced in your decision-making, it can really reduce risk.
Reducing risk means more productivity, more benefits to your businesses. So that is one advantage I could see, but there will be a number of advantages. I think we'll run out of time if we start talking about all of those.
Gardner: Sanjay, help us better understand. When we take these technologies and apply them to procurement, what does that mean for the procurement people themselves?
Almeida: There are two inputs that you need to make strategic decisions, and one is the data. You look at that data and you try to make sense out of it. As Sudhir mentioned, there is a limit to human beings in terms of how much data processing that they can do -- and that's where some of these technologies will help quite a bit to make better decisions.
The other part is personal biases, and eliminating personal biases by using the data. It will improve the accuracy of your strategic decisions. A combination of those two will help make better decisions, faster decisions, and procurement groups can focus on the right stuff, versus being busy with the day-to-day tasks.
Using these technologies, the data, and the power of the data from computational excellence -- that's taking the personal biases out of making decisions. That combination will really help them make better strategic decisions.
Bhojwani: Let me add something to what Sanjay said. One of the biggest things we're seeing now in procurement, and in enterprise software in general, is that people's expectations have clearly gone up based on their personal experiences outside of work. I mean, 10 years back I could not have imagined that I would never go to a store to buy shoes. I thought, who buys shoes online? Now, I never go to stores. I don't know when I last bought shoes anywhere but online; it's been a few years, in fact. Now, think about that expectation applied to procurement software.
Currently, procurement is looked upon as a gatekeeper; they ensure that nobody does anything wrong. The problem with that approach is that it is a "stick" model; there is no "carrot" behind it. What users want is, "Hey, show me the benefit and I will follow the rules." We can't punish the entire company because of a couple of bad apples.
By and large, most people want to follow the rules. They just don't know what the rules are; they don't have a platform that makes that decision-making easy, that enables them to get the job done sooner, faster, better. And that happens when the user experience is acceptable and where procurement is no longer looked down upon as a gatekeeper. That is the fundamental shift that has to happen, procurement has to start thinking about themselves as an enabler, not a gatekeeper. That's the fundamental shift.