
Monday, 6 January 2014

Project Management isn’t just for IT or Engineering anymore

Post written by Jason Z., Project Manager at Ideaca. Read more about project management on his blog: Unnatural Leadership.
As part of this month’s Ideaca blogging network challenge, we were tasked with discussing our thoughts on Emerging Practices.
One of my favorite quotes to reference from the Project Management Body of Knowledge (PMBOK) is “As project management is a critical strategic discipline, the project manager becomes the link between the strategy and the team. Projects are essential to the growth and survival of organizations.” So, while operational duties are vital to maintaining a company’s forward momentum and revenue generation, projects are strategic: they help organizations react to changes in the external environment that may slow that momentum and/or impair revenue generation.
Taking this as a given, one Emerging Practice that I am pleased to see is that more industries and functions – outside of Engineering and IT – are recognizing the need for project management.
So what does this mean for Project Management as a career? It means that effective Project Management is not just for IT and Engineering anymore. In fact, the rest of the organization is going to have to contend with:
  • Increased workloads for Subject Matter Experts. If you know an organizational area, you will be expected to manage projects in that area.
  • The end of black-box projects. Clients are demanding more visibility into what is being delivered, how it’s being delivered, and how delivery is progressing.
  • Demands for value from staff time. Projects are going to have to deliver more than a “thing.”
  • Controlled change efforts. Successfully implementing changes in an organization can no longer be ad hoc or, to a lesser extent, grassroots. Rather, these efforts must be controlled activities.
This is both amazing and troubling. It’s amazing because proper control, visibility, and communication can return recognizable and material value to organizations. It’s troubling because many organizations may start expecting their people to be expert project managers without any proper training or experience (this link is a great discussion on LinkedIn, by the way).
If your organization is transitioning to more of a project focus, and you don’t have the time or desire to become a fully trained PMP, there are a number of ways to get up to speed on how to be effective:
  • Hire a dedicated (or shared) Project Manager – This person should be able to apply project management best practices while you are focused on the subject matter at hand. If your department doesn't have the budget or enough work for a full time Project Manager, share the PM (both cost and time) with a different department.
  • Mentoring – Junior PMs will often work with Senior PMs for mentoring, so why not do the same? Your company should have a PM for you to reach out to, or you can contact someone in your local PMI chapter.
  • Training – Most colleges offer introductory PM training. In exchange for some of your time over a couple of weeks, you can get trained up on how to run a small project effectively.
  • Reading – There are many great books available. One that I recommend is Project Management Lite: Just Enough to get the Job Done…Nothing more. Another, more detailed, is the big bible: Rita Mulcahy’s guide to passing the PMP on your first try. You don’t have to attempt the PMP; you just need to read this book.
Has your organization made the transition to more project-based initiatives?  How has it impacted you?  What have you learned?

Friday, 8 November 2013

Is Big Data Only About…Big Data?

Post written by Wade W., BI Consultant at Ideaca. Read more about BI on his blog: Pragmatic Business Intelligence.  

If nothing else, IT is all about buzzwords, and “Big Data” is one of the new arrivals to the party.
It is, however, a descriptive one. “Big Data” evokes images of enormous relational databases, providing analytical (or operational) reporting.

Big Data is not only about size, however. Rather, it refers to attributes of data that, together, challenge the ability of a business or system to respond to it. Those attributes include size/volume (of data), speed (of generation), and the number and variety of systems or applications that simultaneously generate data. Big Data is also unique in how it varies in structure. Elements of “structure” include the diversity of its generation (e.g. social media, video, images, manual text, automatically generated data such as a weather forecast), information interconnectedness, and interactivity.

A commonly cited thumbnail statistic is that 80% of the data in companies is unstructured or semi-structured. To clarify those terms: an unstructured data artifact would be a document, an email, or a video or audio clip. A semi-structured data artifact contains data that does not conform to the norms of structured data but includes markers or tags that enforce some kind of loose (or not so loose) structure. XML documents are an example of semi-structured data. Tagged documents in a Knowledge Management system would also fit this definition.

Structured data is what we would find in any database: a Data Model has been defined, and the data is physically arranged within this model into tables. The data in these tables is described with metadata, such as data types (e.g. “character”) and the maximum length of the data (in bytes).
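As a rough illustration of the difference, here is a short Python sketch (the customer record and field names are invented for this example): tags in a semi-structured artifact let a program address individual fields directly, while the same fact expressed as unstructured text would need parsing or NLP to extract.

```python
import xml.etree.ElementTree as ET

# Semi-structured: tags impose a loose structure without a fixed schema.
semi_structured = """
<customer>
  <name>Acme Ltd.</name>
  <note>Prefers email contact; renewed contract in March.</note>
</customer>
"""

# Unstructured: the same facts as free text, with no markers at all.
unstructured = "Acme Ltd. prefers email contact and renewed their contract in March."

root = ET.fromstring(semi_structured)
# The tags let a program address individual fields directly...
customer_name = root.find("name").text
# ...whereas extracting the same fact from `unstructured` would need parsing or NLP.
```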

The methods of data creation are multiplying, and the velocity of its creation is increasing. That, in itself, is a complicating factor. Some analysts (IDC, for example) predict that the Digital Universe – that is, the world’s data – will increase 50x by 2020. Over the same period there will be a growing shortage of storage, which will drive investment in the cloud as both individuals and corporations look for scalable, ubiquitously accessible, lower-cost, and environmentally friendly data storage options. In addition, the same study predicts that unstructured data, especially video, will account for 90% of all that data.

There is also an important historical dimension to Big Data. For decades, companies have been hoarding structured, semi-structured, and unstructured data in hopes of one day being able to extract value from it.

A large percentage of all this data will come with a wrapper of automatically generated metadata – that is, as indicated above, data about (or that describes) that data. A practical example is the holiday photo you click with your mobile phone: those GPS-enabled, media-rich, socially linked devices we all carry transparently capture location, GPS coordinates, time, weather conditions, and a plethora of other data elements along with the image. IDC predicts that such metadata is growing twice as fast as data.

It is clear from the last three paragraphs that Big Data describes explosive growth in data and metadata and an equally explosive opportunity to capture, tame and corral that data to extract value from it.

So the case has been made that we have a lot of data today and will have far more tomorrow. But should your organization be investing in Big Data today?

In a sense, you probably already are. Enterprise Business Intelligence (EBI) environments lay a solid foundation for the next phase of Big Data. EBI is an earlier iteration of Big Data and, married to tools such as Hadoop and NoSQL databases, enables a natural evolutionary growth curve toward mastery of your information ecosystem.

Big Data requires a new way of thinking, new tools, clustered commodity hardware, and, probably, substantial investment. It comes down to your business, and whether there is a clear value-based case for presenting that data to your company’s brainpower. The actual needs will be radically different depending on your industry. Oil and Gas may be interested in leveraging real-time alerts from wellhead data or analyzing petabyte-scale seismic datasets. Packaged Goods multinationals may be interested in monitoring and engaging advocates, detractors, and influencers across multiple Social Media platforms: mining and understanding sentiment and identifying problem areas in real time in order to spot opportunity or avert potential brand-damaging events. Financial institutions may be interested in monitoring international money traffic to identify fraud or illegal activity. Government entities may mine extremist forums or other unstructured data traffic to identify national threats.

Big Data can serve these needs in real time, enabling rapid (or even automated) response to flagged events. Whether it is a fit for your organization today should be determined by viewing your industry and business through a critical lens: your current Information Intelligence maturity, a strategic assessment of the data and information assets currently owned by or available to your organization, and a prioritization of potential initiatives. How much data you harness and convert into information should be a key outcome of this exercise. The opportunities are legion, but initiatives should have clear objectives and success metrics understood prior to project kickoff.
Whether it is today or tomorrow, Big Data is becoming mainstream through necessity. Whether that is a road your organization wants, or needs, to drive today is something all medium and large organizations should be considering now.

What are your thoughts on Big Data? Is your organization currently considering Big Data as a strategic initiative or Proof of Concept?

Thursday, 26 September 2013

Social Analytics meet Business Intelligence

 Post written by Wade W., BI Consultant at Ideaca. Read more about BI on his blog: Pragmatic Business Intelligence.

If your company is a well-known brand, somebody, somewhere is publicly talking about it. Right now. It may be on your own Social channels, in an Internet forum, a blog, or another user-generated content site. Social Media Monitoring – put simply, keeping a constant eye on Social sites including Twitter, Facebook, and hundreds of other platforms to monitor what is being said about your brand – has become a necessity for most large organizations, and it is an art and a science to manage this well. Manage it badly, and you can have a catastrophic image issue (e.g. the 2010 Nestle Palm Oil debacle on Nestle’s own Facebook page). Handle it well, and you can cement a solid relationship with existing clients and convert new clients to your brand (e.g. HP’s little-known but truly brilliant efforts to provide temporary replacement HP hardware to certain individual users on social platforms complaining of broken computers).

From a commercial aspect, companies are increasingly looking at Social Media to contribute to driving revenue, largely through lofty concepts such as “engagement” and “conversion.” Social is unique in not only the speed of the communication, but also the intimate nature of content.  In addition, and importantly, what companies must understand is that in the Social realm, the customer controls the conversation. The implication here is a fundamental paradigm shift for Customer Relationship Management and Marketing, to understand the customer on a personal level, and to handle – with great sensitivity – both the positive and negative sentiment expressed on Social platforms.

(Social) Business Intelligence
A growing and compelling new flavor of Business Intelligence is attempting to tap into the unstructured content on social platforms and structure that data into a format that can be analyzed and mined using new methods such as Sentiment Analysis, which measures the aggregate sentiment across user-posted content. Social Business Intelligence sits uniquely at the convergence of Knowledge Management, Social Media Monitoring, Collaboration, Social Networking, Analytics, Customer Relationship Management (CRM), and Business Intelligence (BI).
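As a rough sketch of the idea behind Sentiment Analysis (not any vendor’s actual method), posts can be scored against a keyword lexicon and the scores aggregated; the lexicon, weights, and posts below are invented for illustration. Real systems use trained models, negation handling, and far larger lexicons.

```python
# Toy sentiment scorer: sum lexicon weights over the words of each post,
# then aggregate across posts. The lexicon and weights are invented.
LEXICON = {"love": 2, "great": 1, "good": 1, "bad": -1, "hate": -2, "broken": -1}

def score(post: str) -> int:
    """Sum the lexicon weights of the words in a single post."""
    return sum(LEXICON.get(word.strip(".,!?").lower(), 0) for word in post.split())

posts = [
    "I love this brand, great support!",
    "My device arrived broken. Bad experience.",
]
# A positive aggregate suggests net-positive sentiment across the sample.
aggregate = sum(score(p) for p in posts)
```

The interesting analysis then happens one level up: the same scores, grouped by product, segment, or channel, become the aggregate sentiment the post describes.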

Social Business Intelligence is at a unique convergence point between several key technologies.

First, there is an important roadblock to get out of the way. Today there is a selection of tools that can do everything I discuss below in one way or another. With that simple sentence, I have rendered technology irrelevant for the purposes of this blog. So let’s focus on what Social BI is, how it is done, and what it means, because that is what is important to business.

I’m not really a buzzword kind of guy, but this is Big Data in its truest form. There are thousands of platforms and sites, of course, but even if we only talk about the current Big Guys (Facebook, Twitter, and Foursquare, for example), this adds up to billions or trillions of conversation segments over a given (even conservative) time horizon. To put this in context: that customer data warehouse you have built over all these years probably doesn’t come close…

Social Business Intelligence offers both Internal and External Opportunity
There are both internal and external opportunities to be realized through Social Media Business Intelligence, and many tools are evolving to support these, some even going so far as to adopt a “Facebook-like” or “Twitter-like” interface, mimicking social interaction and Social Networking site features.

Social Business Intelligence applied internally to an organization could be termed Social Collaboration. For example, certain tools feature collaborative review, where colleagues can ask questions and link the answers to specific reports, or collaboratively comment on and mark up objects such as ad-hoc analytics, graphs, or reports. This ability to comment in real time on powerful business intelligence (even if it is based only on traditional data sources that exclude Social Media data) can add value to the interpretation of the reports companies use today to base key decisions upon, thereby improving the decisions made from today’s Decision Support Systems. Many traditional software vendors have already adopted such functionality.

Of course, where Social Business Intelligence as a disruptive technology becomes particularly interesting is when we start gathering and analyzing that unstructured user-generated content, or even more compelling, when we combine it with our existing “traditional” Enterprise Analytics environments. This empowers organizations to produce new innovative products that target user segments more accurately and respond better to customer support or relationship development opportunities. The value of Social Business Intelligence is not really “about” the frequency of words and phrases users post on social platforms. The value is in segmenting, categorizing, mining and understanding the aggregate of the users’ behavior, and the sentiment of those posts across products, segments and channels.

Social Media has its own unique segments, including Employees, Partners, Influencers, Detractors, and Advocates. We can analyze social network traffic, identify and understand our segments, and tailor personalized or semi-personalized interaction to individuals or to one of these segments. By flagging key comments and monitoring Likes, +1’s, trending subjects, and hashtag use, we enable rapid and targeted responses to user comments: averting a public relations crisis, measuring the success of our Social Marketing programs, or capitalizing on new opportunities.

It’s all about the conversation. And you don’t control it.
Again, companies need to understand that the customer controls the conversation. However, the tools exist that can arrange and present structured knowledge from unstructured noise, providing key information input to areas such as Marketing and Manufacturing to be responsive and agile, acting on data that correlates highly to real-life fact.

At its root, Social Media is about the conversation. This implies new requirements for how we manage our link to the customer and how we most effectively target and market to them. Increasingly, consumers are mistrustful of marketing messages and advertising. They are more likely to find relevance and see value in the reviews and purchases of their friends and peers.

Social Business Intelligence in Practice
To finish, I thought I would provide two examples supporting the claim that, by mining user-generated content, we can correlate with a very high level of confidence to known and validated facts.

Google Flu Trends
An example of single-source user-generated content analysis is Google Flu Trends. Google has been analyzing aggregated web search terms to see whether the geographic frequency of user search terms on Google’s search engine can be correlated with real data on flu epidemics.

While I recognize this is not Social Media per se, the example is very relevant to the argument that user-generated content can be tied to sentiment and used as a predictor of future events, when we clearly understand and define the objective and then identify and measure indicators supporting it.

Google’s site http://www.google.org/flutrends/ca/#CA provides up-to-current-day results, allowing tracking of current and developing flu incidents and epidemics. The site also provides historical graphs over a multi-year period for regions around the globe, showing against known, validated historical data how closely user-generated content tracks – and can predict – real events.
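The underlying statistical idea is straightforward: if weekly search-term frequency rises and falls with weekly confirmed cases, the two series will show a Pearson correlation near 1. A minimal Python sketch, with invented numbers standing in for the real series:

```python
# Pearson correlation between weekly search-term frequency and reported flu
# cases. Both series below are invented for illustration only.
def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length numeric series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

searches = [120, 150, 310, 480, 460, 300]  # weekly "flu symptoms" queries (invented)
cases    = [10,  14,  30,  52,  49,  28]   # weekly confirmed flu cases (invented)
r = pearson(searches, cases)  # a value near 1.0 means a strong linear relationship
```

A correlation this strong on historical data is what justifies using the search series as a near-real-time proxy for the slower official statistics.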

United Nations Global Pulse
Between 2009 and 2011, the United Nations and SAS studied how Social Media and other user-generated content from public internet sources such as blogs, Internet forums, and news published in Ireland and the US could be correlated with validated statistics and leveraged as a complement and a qualitative indicator of real-life events.

For Global Pulse, the focus was on employment status. To summarize from the document found at http://www.sas.com/resources/asset/un-global-pulse.pdf, the UN identified keywords indicating changes in employment status (e.g. “fired”), level of anxiety (e.g. “depressed”), or economic indicators (e.g. loss of housing or auto repossession, cancellation of vacations) in order to monitor sentiment. The results were astonishing. The analysis of sentiment allowed them to predict upticks in unemployment claims as much as four months in advance, with a 90-95% level of confidence. Further, they were able to predict precisely, again with 90-95% confidence, how long after an uptick in unemployment there would be an increase in clear economic indicators, in the form of talk of loss of or negative changes to housing, changes of transport method, or cancellation of travel plans.
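A drastically simplified sketch of the keyword-monitoring idea in Python: the category keyword sets echo the study’s examples, but the posts and counting rule are invented for illustration and bear no resemblance to the actual Global Pulse methodology.

```python
from collections import Counter

# Keyword categories echoing the study's examples; the sets are illustrative.
JOB_LOSS = {"fired", "laid-off", "unemployed"}
ANXIETY = {"depressed", "worried", "anxious"}

def monthly_signal(posts):
    """Count category hits across a month of posts -- a crude leading indicator."""
    counts = Counter()
    for post in posts:
        words = {w.strip(".,!?").lower() for w in post.split()}
        counts["job_loss"] += len(words & JOB_LOSS)
        counts["anxiety"] += len(words & ANXIETY)
    return counts

march = [
    "Just got fired.",
    "So worried about rent.",
    "Feeling depressed, unemployed again.",
]
signal = monthly_signal(march)  # a rising month-over-month count would be flagged
```

Tracking these counts month over month, and lagging them against official claims data, is what turns raw chatter into the four-months-ahead indicator the study describes.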

These two examples underscore that user-generated content in the social realm represents a new and potentially highly accurate source of knowledge when tied to clearly defined objectives and supporting metrics (leveraging appropriate keywords). Indeed, Social Media Business Intelligence has the potential to facilitate a very personal understanding of the customer and, when backed by a well-defined strategy, to strengthen the relationship with our customers, avert PR disasters, and increase engagement and conversion.

What are your thoughts? Is the world ready for Social Business Intelligence? Has your company thought about imposing order and structure to the chaos that is Social Media user-generated Content?

Tuesday, 24 September 2013

You've Collected Data...But Now What?

Post written by Peter T., Management Consultant at Ideaca. Read more about visibility on his blog: Visibility.

The list of technologies that allow us to capture vast amounts of data is quite extensive, and it varies in magnitude of use and exposure within organizations. Companies today can, and most often do, use multiple means of collecting data: spreadsheets, databases, operation-specific software, enterprise systems (ERP, CRM, HRM), and various portals (personal, news, enterprise information, self-service, e-commerce, collaboration)… And the list goes on and on.

It is very evident that companies are really good at collecting data. Whether the data management function within an organization is primitive or advanced, gathering data in spreadsheets or in elaborate enterprise systems and databases, the majority of organizations are great at data collection. Hard copy, soft copy, e-copy, web-displayed: data in all forms, shapes, and sizes is being collected at an enormous pace. If you can write it, print it, draw it, type it, sketch it, draft it, or capture it, you can rest assured it is being gathered.

The question is not what data to capture next, but now that we have all this data, NOW WHAT?  
Once data is collected, do organizations use it in the most efficient way? The overarching question is: now that you have all this data, what value are you getting from it? The following are five steps that will assist organizations in gaining the most value out of their data.

STEP 1 – IDENTIFY YOUR VALUE DRIVERS
Before we can successfully answer the question of the value derived from data, we need to understand what the value drivers are for an organization. Are they profitability, reputation, market share, productivity, customer service? The list can certainly be expanded upon. Getting value out of your operational data is imperative, but if you don’t link the data you are capturing to the value drivers of the organization, you could be spinning your wheels and not realizing the full potential of your systems and efforts.

STEP 2 – LINK DATA TO YOUR VALUE DRIVERS
The next step to ensuring you are making intelligent decisions based on relevant information is to verify that all data captured is linked to the value drivers of your organization. Every piece of information that is collected and processed is intended to provide new intelligence, thereby improving the outcomes of critical operational decisions. The way to do this optimally is by linking significant data-retrieval and performance functions to your value drivers. These links can then be expanded upon where multiple associations exist.

Dissecting the specific data captured will allow organizations to assess data accuracy, timeliness, depth, and most importantly the interconnection with various other data sets and systems. The key is to ensure that crucial data is modeled to display how it is gathered, at what interval, and how data from one source is related to data in another.
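One lightweight way to make this linkage concrete is an explicit mapping from each captured data set to the value drivers it supports; anything mapped to nothing is being collected without driving a decision. All names below are illustrative assumptions, not a prescribed model:

```python
# Map each captured data set to the value drivers it supports. A data set
# mapped to an empty list is collected but drives no decision -- a candidate
# for review in Step 3. All names here are invented for illustration.
DATA_TO_DRIVERS = {
    "maintenance_logs":    ["productivity", "safety"],
    "customer_surveys":    ["customer_service", "reputation"],
    "website_clickstream": [],          # captured, but linked to nothing yet
    "production_output":   ["profitability", "productivity"],
}

unlinked = [ds for ds, drivers in DATA_TO_DRIVERS.items() if not drivers]
```

Even a table this simple surfaces the multiple associations mentioned above, and gives the analysis in Step 3 a defined starting point.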

STEP 3 – ANALYZE
Now that you have modeled all significant operational data, you will be able to focus on the highest-impact pieces. By designing new processes or re-engineering solutions, you will be able to increase the usefulness of the information. The analysis will focus on interconnecting data, assets, management, and operational systems. This exercise requires a thorough look at the data to ensure that standards are in place and that information is collected from across the entire organization, enabling accurate corporate-wide reporting. The outcome of the analysis is a roadmap of operational improvements tied directly to the value drivers of the organization. These can be initiatives such as identifying ways to increase production, improving safety records, decreasing maintenance costs, improving asset visibility, reducing compliance risk, and much more.

STEP 4 – SOLUTIONING
After defining opportunities to improve operations, organizations need to devote time to developing a realistic plan for achieving these goals. A key step in the Solutioning process is developing the overall vision and detailing the various components of development in palatable sizes ready for execution. Increased capabilities through the design of new automated systems, enhanced analytics, processes, and interfaces are just some of the improvements that can be realized. If structured properly, these enhancements can provide the organization with significant wins by capitalizing on the information captured along the way.

Information Technology has helped organizations navigate from simple or non-existent data management environments to an optimized level where data can be used for benchmarking and analysis to drive strategic and operational initiatives. This cannot be done successfully, however, without ensuring that all data captured provides value, and that value is what drives the automation, analysis, and design of advanced systems and integration opportunities. The following diagram depicts the stages of data management and provides a visual of where organizations currently are and how far they may have to go in order to achieve the most optimal level of data management:

[Diagram: stages of data management maturity]

Thursday, 12 September 2013

The data has the answers

Post written by Evan Hu, Co-founder of Ideaca. View his blog here: evanhu.wordpress.com


In a 2001 research report for META Group, Doug Laney laid the seeds of Big Data, defining data growth challenges and opportunities in a “3Vs” model. The elements of this 3Vs model are volume (the sheer, massive amount of data – the “Big” in Big Data), velocity (the speed at which data is generated and processed), and variety (the breadth of data types and sources). Roger Magoulas of O’Reilly Media popularized the term “Big Data” in 2005 by describing these challenges and opportunities. Presently, Gartner defines Big Data as “high-volume, high-velocity and high-variety information assets that demand cost-effective, innovative forms of information processing for enhanced insight and decision making.” Most recently, IBM has added a fourth “V,” Veracity, as an “indication of data integrity and the ability for an organization to trust the data and be able to confidently use it to make crucial decisions.”

The volume of data being created in our world today is exploding exponentially. McKinsey’s 2012 paper “Big data: The next frontier for innovation, competition, and productivity” noted that:
  • a disk drive that can store all of the world’s music costs $600
  • there were 5 billion mobile phones in use in 2010
  • over 30 billion pieces of content are shared on Facebook every month
  • global data generated is projected to grow 40% per year, versus 5% growth in global IT spending
  • 235 terabytes of data had been collected by the US Library of Congress by April 2011
  • 15 out of 17 sectors in the United States have more data stored per company than the US Library of Congress

IBM has estimated that “Every day, we create 2.5 quintillion bytes of data — so much that 90% of the data in the world today has been created in the last two years alone.” In their book “Big Data: A Revolution That Will Transform How We Live, Work, And Think,” Viktor Mayer-Schonberger and Kenneth Cukier state that “In 2013 the amount of stored information in the world is estimated to be around 1,200 Exabytes, of which less than 2 percent is non-digital.” To convey the scale, they note that if that data were placed on CD-ROMs and stacked up, the discs would form five separate piles stretching to the moon.

This sheer volume of data presents huge challenges. For time-sensitive processes such as fraud detection, a quick response is critical: how does one find the signal in all that noise? The variety of both structured and unstructured data is ever expanding in form: numeric files, text documents, audio, video, etc. And last, in a world where 1 in 3 business leaders lack trust in the information they use to make decisions, data veracity is a barrier to taking action.

The solution lies in ever more inexpensive and accessible processing power and the nascent science of machine learning. Abraham Kaplan’s (1964) principle of the drunkard’s search still holds true: “There is the story of a drunkard, searching under a lamp for his house key, which he dropped some distance away. Asked why he didn’t look where he dropped it, he replied ‘It’s lighter here!’” A massive dataset that carries the same bias as a small dataset will only give you a more precise validation of a flawed answer, and we are still in the early days. Even so, Big Data is the opportunity to unlock answers to previously unanswerable questions and to uncover insights unseen. With it come new dangers, as the NSA warrantless surveillance controversy clearly exposes.

I have had the privilege of listening to Clayton Christensen speak several times. He has one through line that stuck with me and forever embedded itself in my consciousness: “I don’t have an opinion. But I have a theory, and I think my theory has an opinion.” I believe the same of Big Data. The data has an opinion; the data has the answers.