Tuesday, November 4, 2014

Big Data and the Republican and Democratic Parties

By 2012, the Democratic Party had centralized the storage of its voter data.  At the same time, the Republican Party had begun building a new data platform, hiring ex-Facebook engineer Andy Barkett to spearhead the effort.  Two notable data-services firms, Targeted Victory and FLS Connect, were already established in Republican political circles; in 2012, the two billed a combined $86 million for services to the Republican presidential campaign.

Problems arose: some firms withheld data, and the two established firms were put off by Barkett's criticisms of the party's lack of tech talent.

Several firms were involved in building the data platform and gathering data, and the situation has been compounded by late software delivery by Barkett and the RNC.

In the meantime, quite a few state and local Republican candidates have turned to other data sources, and Mr. Barkett has relinquished his current role in favor of developing a long-term data strategy for the RNC.

The use of data to target specific voters has become invaluable in political races.  Groups of voters can be targeted by what motivates them; their consumer behavior and media habits can be key determinants.

In the midterm elections, the lack of a single data source is not believed to be enough to cost the Republican Party, but it may pose a more serious problem in the next presidential election.

For the midterm election, the Democratic Party has developed different means of outreach to target voters.  Key to its efforts has been signing up occasional Democratic-leaning voters (those who do not vote in every election) for vote by mail and encouraging early voting.

In Lake County, Illinois, there has been a relentless effort to get out the vote.  Targets are identified through Big Data methods, then called and personally visited by volunteers.  After each round of canvassing, volunteers enter the results back into the computer, adding fresh voter intelligence to the Big Data pool.
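The canvassing feedback loop described above can be sketched in code.  Everything here is an illustrative assumption — the field names, voter IDs, and targeting rule are hypothetical, not details from the article:

```python
# Hypothetical sketch of a canvassing feedback loop: volunteers feed
# each round's results back into the voter data pool, which then
# produces a fresh target list for the next round.

# The voter "pool": targeting data keyed by (made-up) voter IDs.
voter_pool = {
    "V001": {"contacted": 0, "leans_democratic": True, "plans_early_vote": False},
    "V002": {"contacted": 0, "leans_democratic": False, "plans_early_vote": False},
}

def record_canvass_result(pool, voter_id, leans_democratic, plans_early_vote):
    """Feed one canvass result back into the pool (the feedback step)."""
    voter = pool[voter_id]
    voter["contacted"] += 1
    voter["leans_democratic"] = leans_democratic
    voter["plans_early_vote"] = plans_early_vote

def next_round_targets(pool):
    """Re-target: likely supporters who have not yet committed to vote early."""
    return [vid for vid, v in pool.items()
            if v["leans_democratic"] and not v["plans_early_vote"]]

# A volunteer visits V001, who agrees to vote early; the pool updates
# and V001 drops off the next round's target list.
record_canvass_result(voter_pool, "V001", leans_democratic=True, plans_early_vote=True)
print(next_round_targets(voter_pool))  # prints []
```

The design point is simply that each round of outreach both consumes and produces data, which is what makes the targeting adaptive.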

What does this effort buy the Democrats?  A Wall Street Journal poll found that 25 percent of voters in early-voting states have already voted.  Of those early voters, 49 percent voted for Democratic candidates versus 40 percent for Republican candidates.

We will continue to see Big Data and its feedback loop segment voters and tailor specific approaches to them.  Campaigns will be able to measure the results of their efforts and adapt as needed to reach voters.  How this evolves with the next presidential election will be interesting to watch.


References:
'Big-Data Overhaul Jolts Old Political Party Ways,' Patrick O'Connor, WSJ, 10/22/14.

'Early Votes Mean Early Party Boasts,' Reid J. Epstein, WSJ, 11/3/14.

Wednesday, October 22, 2014

Quality Principles Applied to Ebola Care

As mentioned in the previous post, Centers of Excellence exist for Ebola care in the US.  Given the reported issues with care outside these facilities, nurse Nina Pham was recently transported to one of these centers.  Others who are newly diagnosed will also be treated at one of these facilities, and standards will be communicated and taught to healthcare facilities.  President Obama has even appointed an Ebola 'czar.'

So what does this tell us about quality?  When the stakes are high and the risks are significant, let expertise rule: employ Centers of Excellence, develop and communicate standards, and place a hard-to-manage situation with high potential for miscommunication under central governance.

Will this all contribute to better outcomes?  We will need to wait and see.

Thursday, October 16, 2014

Are Electronic Health Records (EHRs) Partly to Blame in Ebola case?

When Thomas Eric Duncan was turned away from the Dallas hospital emergency room on September 20, was a lack of alerts within the EHR the cause of the oversight?  This premise comes into play because the attending physician may not have been aware of the patient’s recent travel to Liberia.  The travel was indeed documented in the hospital’s EHR, but the physician would have had to know the record existed in order to deliberately retrieve it.

EHRs are in most cases customized to specific facilities.  They may require 30 to 60 seconds to log in.  They may merely automate flawed paper workflows.  These systems can require many ‘click-throughs’ to document a situation, and they can make searching for and retrieving records more cumbersome than the processes they've replaced.

Regulatory pressures in the past few years have fueled the rapid adoption of EHRs.  By the end of 2013, about 58 percent of all hospitals had implemented EHRs, four times the share in 2010.

Currently, EHRs do not incorporate health alerts from the Centers for Disease Control and Prevention or other agencies.  These systems also generally miss the mark on sharing data with other providers: only about 14 percent presently have this capability.  Deficiencies in staff training have been noted as well.

Considering the system alone, there are usability problems such as arduous maneuvering to enter and retrieve data, often under severe time pressure, as in an ER.  In the attempt to be comprehensive in gathering and soliciting information, these systems confront the user with a barrage of questions to wade through.

A well-developed, easy-to-use, and fully fit-for-purpose EHR would cut away the periphery and focus keenly on the vital information demanded by a given scenario.  Workflows and interactions would be synchronized more closely with how users actually work, alerts from outside agencies would be incorporated, and information sharing with partner healthcare providers would be seamlessly integrated through standards.
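To make the idea of incorporating outside-agency alerts concrete, here is a minimal sketch of such a screening rule.  The region list, field names, and threshold are assumptions for illustration only — no real EHR or CDC interface is being described:

```python
# Hypothetical screening rule of the kind described above: flag a
# patient whose documented travel matches an outbreak region published
# by an outside agency, so the alert surfaces to every clinician who
# opens the chart. All names, regions, and thresholds are assumptions.

OUTBREAK_REGIONS = {"Liberia", "Sierra Leone", "Guinea"}  # e.g., from an agency feed
FEVER_THRESHOLD_F = 100.4

def screening_alerts(patient):
    """Return alerts that should surface prominently in the chart."""
    alerts = []
    flagged = set(patient.get("recent_travel", [])) & OUTBREAK_REGIONS
    if flagged:
        alerts.append("Recent travel to outbreak region: " + ", ".join(sorted(flagged)))
    if patient.get("temperature_f", 0) >= FEVER_THRESHOLD_F:
        alerts.append("Fever at or above %.1f F" % FEVER_THRESHOLD_F)
    return alerts

# A patient record with documented Liberia travel and a 103-degree fever
# would trigger both alerts rather than relying on someone retrieving
# the travel history by hand.
patient = {"recent_travel": ["Liberia"], "temperature_f": 103.0}
for alert in screening_alerts(patient):
    print(alert)
```

The point of the sketch is the architecture, not the rule itself: the alert fires from the data already in the record, instead of depending on a clinician knowing to look for it.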

As seasoned quality professionals are well aware, an automated system is only part of an overall, integrated whole.  For it to work as intended, system users and stakeholders need to interact and converse with each other on matters requiring attention, and general protocols must be followed.  For example, sending any patient home from an ER with a 103-degree fever may not be a good idea; it may warrant further investigation and patient care.

The challenges posed by a possible epidemic most likely require the direct involvement of the CDC and its greater expertise in facility processes and controls.  The CDC director, in retrospect, regrets not placing a team at the Texas hospital to monitor infection control.  As the story unfolds, we are learning that specialized expertise and facilities provide the best care for Ebola patients and decrease the risk of spreading infectious disease.  Only four US medical facilities have specialized treatment centers and experience with deadly infectious diseases: Atlanta’s Emory University, the Omaha Medical Center, St. Patrick’s Medical Center in Missoula, MT, and a National Institutes of Health facility.  These facilities are said to exceed the standards required by the Ebola contagion.

The inability of the EHR to alert the attending physician is dwarfed by omissions in human communication, awareness, and training.  The challenges with this particular EHR have been unduly magnified against the complex web of human interactions, skill, and judgment of healthcare providers and agencies facing an Ebola contagion.  The outsized emphasis on the shortcomings of one EHR may, however, spur improvements in EHRs overall.


References
'Are E-health Records at Fault for Ebola Mistakes?'  Computerworld, 10/8/2014.
http://www.computerworld.com/article/2823713/are-e-health-records-at-fault-for-ebola-mistakes.html

'Treatment Spotlight Back in Atlanta,' WSJ, 10/16/14.

'Dallas Warns More Cases Possible,' WSJ, 10/16/14.

Wednesday, July 2, 2014

What are examples of "universal principles" of Quality Assurance?



1. Measure early and often.
2. Correcting a problem as early as possible minimizes other issues and saves costs.
3. QA consists of a cycle of observing, measuring, and correcting deviations.
4. Continuously adjust, learn, and adapt.
5. Develop a quality assurance process.
6. Those responsible for a process are the best source of knowledge about it.
7. Those responsible, and with experience, are best positioned to monitor, correct, enhance, and improve.
8. To reap tangible gains from QA, adequate time is needed for observation, learning, correction, and improvement.
9. Responsibilities can cut across organizational lines and can be difficult to pin down, so have a plan to address them.
10. Management support can make the difference between success and failure.
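Principles 1 through 3 above can be sketched as a small control loop.  The target value, tolerance, and drifting process are all illustrative assumptions, chosen only to show the observe-measure-correct cycle:

```python
# Minimal sketch of the QA cycle from principles 1-3: observe a
# measurement early and often, compare it to a target, and correct
# deviations as soon as they exceed tolerance. The target and
# tolerance values are made up for illustration.

TARGET = 10.0
TOLERANCE = 0.5

def qa_cycle(observe, correct, rounds=3):
    """Run several rounds of observing, measuring, and correcting."""
    for _ in range(rounds):
        measurement = observe()            # measure early and often
        deviation = measurement - TARGET   # compare against the standard
        if abs(deviation) > TOLERANCE:     # out of tolerance?
            correct(deviation)             # correct as early as possible

# Illustrative process: a value that has drifted to 12.0 and gets
# pulled back to the target on the first round.
state = {"value": 12.0}
qa_cycle(observe=lambda: state["value"],
         correct=lambda dev: state.__setitem__("value", state["value"] - dev))
print(state["value"])  # prints 10.0
```

Catching the deviation in the first round is the cost argument of principle 2: later rounds find nothing to fix because the correction happened early.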

Tuesday, June 3, 2014

High Level Questions for Assessment of Existing QA Process


  1. When was the QA process last reviewed?
  2. Are surveys conducted to obtain feedback on the efficacy of the QA process?
  3. Who are the stakeholders?
  4. Who is ultimately responsible for the overall QA process?
  5. What are the respective roles and responsibilities in the current QA process?
  6. What gaps are not addressed by the current QA process?
  7. Can costs and benefits be quantified for the current process?  If so, what are they?
  8. What is the wish list for the QA process?
  9. What unfulfilled needs could an improved QA process fulfill?
  10. How are the efficiency and effectiveness of the existing QA process currently measured?
  11. What measurements or signals indicate the QA process is performing as expected?


Thursday, May 29, 2014

What is Quality Assurance (QA)?

Quality Assurance (QA) is an overarching function:

  • A QA function serves as a vehicle to provide processes, methods, measurement, adaptation, continuous learning, and adjustment
  • QA metrics are the results of systematically observing and recording behavior for the purpose of creating adaptive standards and controls
  • The ultimate purpose is continual betterment of the QA target
  • It is proactive, not reactive
  • It is generally invoked at higher organizational levels, rather than within line functions
  • QA takes longer timeframes into account than QC (Quality Control)
  • QC is more of a back-end function
  • The effort and cost of QA can be visualized as a sliding scale; it is vital to determine the tolerance level for risks and unknowns
  • What is "good enough"?
  • QA is rooted in manufacturing disciplines, yet its principles can be applied to services and knowledge work
  • It forces scrutiny and examination in a timely manner
  • Quality denotes conformance to specifications and fitness for use or purpose