Posted: January 29th, 2012 | Author: Andy Reiman | Filed under: Scanner Review | No Comments »
The Scan Man is generally not a fan of mobile personal scanning devices unless they offer some value over simply using a smartphone camera to scan. The Doxie Go is interesting, not because it is fast, but because it is wireless and battery powered. While the mobile phone is great for scanning single-page documents, anything with more pages is a pain. The Doxie scans several pages a minute and then transmits them to your computer or mobile device. Check out this little scanner here.
Doxie Go makes scanning smart and simple. Just charge it up and turn it on, anywhere – insert your paper, receipts, and photos to scan, archive, and share.
For all your paper
Doxie scans everything from bills and receipts to reports, drawings, recipes, ideas, business cards, photos, and everything else in your home or office.
Tiny and fast
Doxie’s tiny size makes it easy to scan at your desk or on the go, no computer required – just insert your paper. Scan full color pages in just 8 seconds.
Built-in battery and memory
Doxie’s rechargeable battery keeps you scanning anywhere. Scan up to 600 pages (2400 photos) with built-in memory, or insert an SD card or USB flash drive for additional storage.
Sync to all your devices
Doxie Go syncs to your computer when you plug it in via USB – just like a digital camera. And with available Wi-Fi, you can also sync scans wirelessly to your iPhone or iPad.
Removable flash storage
Doxie Go has built-in ports for USB flash drives and SD cards. Scan to Doxie’s internal memory, or insert a flash drive, SD card, or wireless SD card for easy expansion.
Doxie software included
Amazing scanners deserve amazing software. Doxie 2.0 syncs scans, creates searchable PDFs, creates multi-page stacks, and sends to local and cloud apps.
OCR + Searchable PDFs
Award-winning ABBYY® OCR technology recognizes the text in your documents, creating searchable PDFs. Search all your text, then copy and paste with ease.
Crisp, clean scans
Doxie offers crisp, clean copies of your paper in full color at up to 600 dpi. Automatic contrast, rotation, and cropping make every scan look amazing.
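What “searchable PDF” means in practice is that OCR attaches a text layer to each scanned page, which can then be indexed and searched like any other text. As a toy illustration of the concept (not Doxie’s actual software, and the sample pages below are made up), here is a minimal inverted index over hypothetical OCR output:

```python
# Toy sketch of what makes a scanned PDF "searchable": OCR yields text
# per page, which an inverted index (word -> set of page numbers) makes
# instantly searchable. Illustration only, not Doxie's software.
from collections import defaultdict

def build_index(pages):
    """Map each lowercased word to the set of pages it appears on."""
    index = defaultdict(set)
    for page_num, text in enumerate(pages, start=1):
        for word in text.lower().split():
            index[word.strip(".,!?")].add(page_num)
    return index

def search(index, word):
    """Return the sorted list of pages containing the word."""
    return sorted(index.get(word.lower(), set()))

# Hypothetical OCR output for a three-page scan:
ocr_pages = [
    "Invoice 1042 for office supplies.",
    "Payment due within 30 days.",
    "Contact billing for invoice questions.",
]
index = build_index(ocr_pages)
```

Real OCR software (like the ABBYY engine Doxie bundles) embeds the recognized text invisibly behind the page image inside the PDF itself, so any PDF reader can do this kind of search.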
Posted: January 23rd, 2012 | Author: Andy Reiman | Filed under: Scanner Review, Scanning Software | No Comments »
Kodak released a letter to their clients regarding the Document Imaging division and the company restructuring. Download the PDF.
Posted: January 19th, 2012 | Author: Andy Reiman | Filed under: Thoughts | Tags: kodak | No Comments »
The Scan Man is a long-time supporter of Kodak’s Document Imaging Division, as both a user and a certified reseller. It is a very sad story, but one that does not necessarily have a bad ending. You can read the gory details below; however, it is my belief that the people and products that are part of the document scanning unit will continue on and grow. I have spoken with a number of people in Rochester off the record. The document imaging unit produces outstanding scanners and software and was a bright spot in an otherwise dark place. There is widespread belief that the document imaging products will live on with Kodak as a restructured entity or as an acquired entity within another manufacturer.
The Scan Man predicts a strong future ahead for the document imaging division once the current situation is resolved.
From Reuters: Eastman Kodak Co, which invented the hand-held camera and helped bring the world the first pictures from the moon, has filed for bankruptcy protection, capping a prolonged plunge for one of America’s best-known companies.
The more than 130-year-old photographic film pioneer, which had tried to restructure to become a seller of consumer products like cameras, said it had also obtained a $950 million, 18-month credit facility from Citigroup to keep it going.
The loan and bankruptcy protection from U.S. trade creditors may give Kodak the time it needs to find buyers for some of its 1,100 digital patents, the key to its remaining value, and to reshape its business while continuing to pay its 17,000 workers.
“The board of directors and the entire senior management team unanimously believe that this is a necessary step and the right thing to do for the future of Kodak,” Chairman and Chief Executive Antonio Perez said in a statement.
“Now we must complete the transformation by further addressing our cost structure and effectively monetizing non-core intellectual-property assets. We look forward to working with our stakeholders to emerge a lean, world-class, digital imaging and materials science company,” he added.
At end September, the group had total assets of $5.1 billion and liabilities of $6.75 billion.
Kodak said it and its U.S. subsidiaries had filed for Chapter 11 business reorganization in the U.S. Bankruptcy Court for the Southern District of New York. Non-U.S. subsidiaries were not covered by the filing and would continue to honor all obligations to their suppliers, it added.
A FALLEN ICON
Kodak once dominated its industry and its film was the subject of a popular Paul Simon song, but it failed to embrace more modern technologies quickly enough, such as the digital camera — ironically, a product it even invented.
Its downfall hit its Rust Belt hometown of Rochester, New York, with employment there falling to about 7,000 from more than 60,000 in Kodak’s heyday.
Its market value has sunk to below $150 million from $31 billion 15 years ago.
In recent years, Chief Executive Perez has steered Kodak’s focus more toward consumer and commercial printers.
But that failed to restore annual profitability, something Kodak has not seen since 2007, or arrest a cash drain that has made it difficult for Kodak to meet its substantial pension and other benefits obligations to its workers and retirees.
Perez said bankruptcy protection would enable Kodak to continue to work to maximize the value of its technology assets, such as digital-imaging patents it says are used in virtually every modern digital camera, smartphone and tablet. The company has also built up patented printing technology.
Kodak said it was being advised by investment bank Lazard Ltd, which has been helping Kodak look for a buyer for its digital patents.
Other advisers included business-turnaround specialist FTI Consulting Inc, whose vice chairman, Dominic DiNapoli, would serve as chief restructuring officer for Kodak, supporting existing management.
In the last few years, Kodak has used extensive litigation with rivals such as Apple Inc, BlackBerry maker Research in Motion Ltd and Taiwan’s HTC Corp over those patents as a means to try to generate revenue. Those patents may now be sold through the bankruptcy process.
WALKING ON THE MOON
George Eastman, a high school dropout from upstate New York, founded Kodak in 1880, and began to make photographic plates. To get his business going, he splurged on a second-hand engine for $125.
Within eight years, the Kodak name had been trademarked, and the company had introduced the hand-held camera as well as roll-up film, where it became the dominant producer.
Eastman also introduced the “Wage Dividend” in which the company would pay bonuses to employees based on results.
Nearly a century after Kodak’s founding, the astronaut Neil Armstrong used a Kodak camera the size of a shoebox to take pictures as he became in 1969 the first man to walk on the moon.
Those pictures arguably had more viewers than the 80 films that have won Best Picture Oscars and were shot on Kodak film.
Six years after Armstrong’s walk, and not long after Simon told his mama not to take his Kodachrome away, Kodak invented the digital camera.
The size of a toaster, it was too big for the pockets of amateur photographers, whose pockets now are stuffed with digital offerings from the likes of Canon, Casio and Nikon.
But rather than develop the digital camera, Kodak put it on the back-burner and spent years watching rivals take market share that it would never reclaim.
In 1994, Kodak spun off a chemicals business, Eastman Chemical Co, which proved to be more successful.
Kodak’s final downfall in the eyes of investors began in September when it unexpectedly withdrew $160 million from a credit line, raising worries of a cash shortage. It ended September with $862 million of cash.
PENSIONS IN FOCUS
In its bankruptcy, Kodak could try to restructure its debts, or perhaps sell all or some of its assets, including the patent portfolio and various businesses.
It is unclear how Kodak will address its pension obligations, many of which were built up decades ago when U.S. manufacturers offered more generous retirement and medical benefits than they do now. Many retirees hail from Britain where Kodak has been manufacturing since 1891.
The company had promised to inject $800 million over the next decade into its UK pension fund. It now remains unclear how that country’s pension regulator might seek to preserve some or all of the company’s obligations to British pensioners.
(Reporting by Nick Brown, Caroline Humer and Jonathan Stempel; Editing by Mark Bendeich)
Posted: January 17th, 2012 | Author: Andy Reiman | Filed under: Disaster, Document Management, security | Tags: botnets, document management, InformationWeek, security, worms | No Comments »
The Scan Man caught this article, published by Information Week, about the top 10 security problems. My takeaway for our clients is: Encrypt everything! If you are going to suffer a breach, don’t let the crooks just walk away with your data. From the pages of Information Week:
From cyber espionage to Android malware, expect to see a greater variety and quantity of attacks than ever before.
As 2012 gets underway, what can businesses expect on the information security front?
If 2011 was any indication, this year will be anything but quiet. Last year featured seemingly nonstop waves of hacking, malware, and spear-phishing attacks that succeeded in exploiting well-known businesses, including RSA and Sony. All told, businesses’ collective data breaches exposed millions of records.
Expect 2012 to offer more of the same and then some. In particular, keep an eye on these 10 top information security trends:
1. Breaches now inevitable, say businesses. Over the past few years, there’s been a notable change in information security rhetoric: Instead of preventing all attacks from succeeding, many CIOs now acknowledge that getting hacked is a question of when, not if. The chief culprit is the sheer volume of attacks being launched, which makes the chance that one of them will succeed nearly inevitable. According to the “2011 Data Breach Investigations Report” from Verizon, for example, the number of attacks launched online against businesses between 2005 and 2010 increased by a factor of five.
The new mandate, then, is not just to maintain killer defenses, but also to have the right technology and practices in place to quickly detect when the business has been breached, and then to block the attack and ideally identify how the breach occurred and what might have been stolen. “We frequently see organizations with protective measures based on the assumption that they are not a target,” said Alan Brill, senior managing director of the cyber security and information assurance division at Kroll, in a recent report. “Yet 2011 taught us that no one is exempt from attack.”
2. Cyber espionage continues. If there is one guarantee for 2012, it’s that industrial or cyber espionage–often executed via “low and slow” and thus difficult-to-detect exploits–will continue unabated. Such attacks were too effective in 2011 for attackers not to keep pressing, especially because the social engineering techniques often employed in exploits are incredibly easy to tap and reuse. For example, “it is estimated that the attack which hit RSA was actually used against over 700 other companies,” said Harry Sverdlove, CTO of Bit9, in a recent report. Likewise, the Nitro attack against chemical and defense companies hit at least 48 businesses, Shady RAT hit at least 70 businesses, and Operation Night Dragon exploited multiple energy companies. Although China often gets the blame for such attacks, arguably every major country–allies or otherwise–practices cyber-espionage.
3. Mobile malware continues to increase. For countless years running, pundits have declared it to be the year of mobile malware. Here’s the reality: to date, mobile malware has largely targeted the Android operating system, full stop, and it rates as little more than a nuisance. Although mobile malware grabs headlines, it’s not very lucrative for attackers because their number-one target is financial information, and that predominantly resides on people’s desktops and laptops.
Accordingly, attackers’ biggest bang for the buck continues to be attacking Windows systems, largely via operating system and application-level vulnerabilities, as well as third-party plug-ins with known bugs. Even so, expect the ongoing, negative headlines associated with Android smartphone hacking–or “smacking,” as Bit9’s Sverdlove calls it–to drive more manufacturers to create locked-down Android smartphones, which would be a boon for securing business users.
4. Mobile devices get anti-theft protection. If mobile devices aren’t under attack to the extent that PCs are, mobile devices still carry a well-known security risk: they tend to get lost or stolen. That fact alone should be reason enough for businesses to take a more rigorous approach to securing mobile devices, including tracking them when they go missing, and ensuring that remote-wipe capabilities are in place should it be too difficult or expensive to recover the devices. With the “bring your own device to work”–a.k.a. BYOD, or the consumerization of IT–trend in full force, expect to see more organizations attempt to add better security to their employees’ mobile devices, including smartphones.
5. Spear-phishing scourge continues. Fast, cheap, and out of control: spear-phishing attacks continue to plague businesses large and small. Witness EMC’s RSA, which experienced a breach that compromised aspects of its SecurID system, simply because an employee opened a malicious Excel file that exploited a known vulnerability and allowed external attackers to create a beachhead in RSA’s network. RSA, of course, is far from the only business or government agency that’s been exploited by these fake–but real-enough-looking–emails. Unfortunately, stopping such attacks is impossible from a purely technological standpoint. Instead, users must be educated–warned, cajoled, trained–to resist such attacks, but even that is not a foolproof strategy. Accordingly, some spear-phishing attacks will continue to succeed.
6. Social engineering attacks hit social networks. All social-engineering attacks succeed based not on technological sophistication, but rather by fooling users. It costs little to send someone an email that redirects them to a fake PayPal website, which tricks them into entering their actual PayPal username and password, which is then passed to attackers. Accordingly, social engineering attacks aren’t going away. Furthermore, with 800 million people now registered on Facebook, and 175 million on Twitter, expect attackers to spend more time targeting social networks. What do such attacks seek to steal? According to Check Point, the primary impetus behind social engineering attacks is financial gain (51%), followed by accessing proprietary information (46%), gaining a competitive advantage (40%), and revenge (14%).
7. Botnets keep infiltrating businesses. According to Panda Labs, three quarters of all new malware strains seen in 2011 were Trojan applications, able to silently infect PCs and make them function as part of a botnet, while also “phoning home” to attackers with stolen information of interest. Cybercrime toolkits now make it easy for any criminal to generate and distribute malware that has a high degree of success at infecting PCs. Such toolkits’ easy availability and the potential profits on offer–which far exceed the toolkits’ initial purchase or rental cost–means that large-scale malware attacks aimed at exploiting PCs and pressing them into silent service as nodes in a botnet will only continue to increase. Ditto for the evolution of botnet-related ecosystems, which offer everything from “malware infection as a service” to leasing botnets by the hour or for the day for use in attacks or scams.
8. Breach notifications gain greater traction. Today, all 50 states effectively require that businesses notify their customers when their personal information has been potentially exposed. But different notification requirements–for example, for medical records–means that although many breaches might be disclosed to government watchdogs, they might never be fully disclosed publicly. (See the RSA breach.) Might Congress finally pass a law requiring that all data breaches be tracked by a single, centralized agency? That doesn’t seem likely, although some other countries now appear to be pursuing that plan. Germany enacted a federal data-breach notification law in 2010, and other European countries have expressed interest. Meanwhile, Canada is weighing changes to its Personal Information Protection and Electronic Documents Act (PIPEDA) that would make data breach disclosures mandatory for that country’s businesses.
9. Critical infrastructure rhetoric keeps heating up. What do you do if you’re the head of a government agency tasked by Congress with protecting the nation’s critical cyber infrastructure, yet said infrastructure is 95% privately owned? You posture, especially where large cyber-security budgets are concerned. Said posturing has been the modus operandi of both legislators and agency heads, notably at the Department of Homeland Security and the Department of Defense. Businesses, meanwhile, don’t seem to have leapt at the chance to let the government tell them how to run their networks. That said, expect industry-led information-sharing agreements to help bridge this gap in 2012, by facilitating freer sharing of threat intelligence information between government agencies and critical infrastructure businesses.
10. Code gets externally reviewed. Attackers often exploit known vulnerabilities in applications, and there are a plethora of such bugs to choose from. Accordingly, this business mandate is clear: Developers must take the time to code cleanly, and eradicate every possible security flaw before the code goes into production. Developers, however, can’t do this on their own. They need top-down support, with everyone from executives to front-line personnel held accountable for code quality, which by the way can be measured. Indeed, both internal development tools and on-demand code-review services can scan code, pinpoint flaws, and recommend fixes. Remediating those bugs, by the way, often takes just a matter of days, and is always less expensive than fixing them after products ship.
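As an aside on point 10: the classic example of a flaw that code review and scanning tools catch is SQL built by string concatenation. The sketch below (my own illustration, not from the article) shows the vulnerable pattern and the standard parameterized fix, using Python’s built-in sqlite3 driver:

```python
# Illustrative only: the textbook flaw a code review or static analysis
# tool flags is SQL assembled from untrusted input. The parameterized
# version lets the driver bind the value as data, never as SQL text.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

def find_user_unsafe(name):
    # VULNERABLE: attacker-controlled 'name' becomes part of the SQL.
    return conn.execute(
        "SELECT role FROM users WHERE name = '%s'" % name).fetchall()

def find_user_safe(name):
    # FIXED: '?' placeholder; the driver escapes the value for us.
    return conn.execute(
        "SELECT role FROM users WHERE name = ?", (name,)).fetchall()

# A classic injection payload: makes the unsafe WHERE clause always true.
payload = "' OR '1'='1"
```

Running `find_user_unsafe(payload)` returns every row in the table, while `find_user_safe(payload)` correctly returns nothing, which is exactly the kind of difference an external review is paid to spot before the code ships.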
Posted: January 11th, 2012 | Author: Andy Reiman | Filed under: Hardware Review, Medical | Tags: ehr, emr, ipad, medical records, mobile technology | No Comments »
The Scan Man loves the iPad platform for taking medical practices paperless, but as Jonathan Krausner’s BEI blog explains, there are some big challenges. Read on:
In November CIO Magazine published an article entitled “iPad in Healthcare: Not so Fast” that questioned the recent hype surrounding replacing traditional PCs and tablets with iPads in hospitals and physician offices. The article caused quite a stir and prompted Drex DeFord, the SVP and CIO of Seattle Children’s Hospital, to write the piece below, which is reprinted with his permission.
By Drex DeFord
SVP & CIO, Seattle Children’s Hospital
Who knew that an article entitled, “iPads in Healthcare: Not So Fast” — including quotes from my trusted CTO, Wes Wright — would cause such a stir. I’ve seen tweets, Facebook posts, comments on LinkedIn, and rebroadcasts of portions of the article in other articles. Most of the stuff I’ve read has been negative-ish about Wes’ comments.
Then Anthony called and asked me to comment on our “bad experience with iPads.”
Here’s what I think: First, you should read the original article and consider carefully what Wes actually says:
1. Legacy applications often don’t work well on iPads. For the most part, they’re NOT built to run on iPads. So the interface is indeed clunky. The iPad isn’t a mouse, keyboard, and 21-inch monitor, and that’s what many of the original apps are built to use. This is a little bit of a “duh” moment for me, but it needs to be said out-loud, because the iPad isn’t the cure-all solution or the “perfect carry-around device” — at least not yet. There’s a lot of work to be done to make that a reality. I’ll talk more about that later.
2. Docs love iPads. To this one, I have to say, almost all of us love iPads. I love mine. I took one of the first iPads we purchased at Children’s to test capabilities – accessing legacy apps, using Citrix Receiver to get to my VDI desktop, reading email. What I found out, though, was that I loved it for all the non-work reasons most: it was my bank, a decent note-taking device, my yoga instructor, and it let me remote control my DVR via the Internet when I forgot to set up a recording. Once I realized that most of the things I loved the iPad for didn’t really have to do with “work stuff,” I gave it back to the test pool and bought my own personal iPad. Yes, we all love our iPads, but I was quick to realize that legacy apps don’t work well on iPads, as I said, yet. Wes said that out loud. For Apple-loyalists, this was heresy. (for the record, I drive a MacBook Pro too)
3. We should be at least a little worried about iCloud. The new offering from Apple is very cool. But I worry all the time about data-leakage in all forms, from all sources. An important part of my job is to protect patient data. iCloud may be, potentially, another threat to that charge. If you’re not thinking about what data lives on an iPad, and then syncs up to the iCloud, or what data might be vulnerable when an iPad is lost or stolen, maybe it’s time to consider the unpleasant possibilities.
This next part is from me: Since I agree that 1, 2, and 3 are generally true, then I hope you’ll understand our view that there’s a lot of work to be done bridging legacy apps (built for PCs) to the iPad form-factor, and making sure the data sent to, and used on, iPads is secure. There are a hundred different ways to do this, and all of those require time, planning and resources – in a severely resource-constrained environment.
Specific to our EMR, we’re working with Cerner on how the EHR bridge from legacy to iPad might work for Children’s. You should know that I have some criteria for how this should work, because I don’t really want the patient information to reside on a portable device long-term (see #3), so we have to be thoughtful about our solution.
Building bridges from other legacy apps to the iPad will be challenging, and ultimately somebody has to pay for that bridge to be built. I work at a small Children’s hospital, and I don’t have a large iPad development staff (honestly, I don’t have even a single FTE budgeted for iPad development; I’m betting that’s a common situation). Every legacy vendor working on an iPad bridge for their product is taking a slightly different approach, leaving the customer organization to figure out how to integrate and protect data (but that’s not any different than what we’ve lived with for years).
Here’s a wish: similar to a “Virtual Desktop” infrastructure, I need a “Virtual iPad” infrastructure – something that allows us to run and serve-up those vendor built legacy-to-iPad apps in a secure way. So when the Internet connection to the users’ iPad ends, so does the access to patient-information. No data resides on the physical iPad. Oh, and I’d like one of those for the dozens of versions of Android too … while I’m wishing.
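The “Virtual iPad” wish (serve the data, never store it on the device) boils down to server-side sessions with a short lifetime: the server holds the patient record and hands the client only a temporary token. Here is a hypothetical sketch of that pattern, not Cerner’s or any vendor’s actual design:

```python
# Hypothetical sketch (no vendor's real design) of the "no data on the
# device" idea: records live server-side, the client gets a short-lived
# token, and when the session expires, access ends. Nothing persists on
# the tablet.
import secrets
import time

class SessionStore:
    def __init__(self, ttl_seconds, clock=time.monotonic):
        self.ttl = ttl_seconds
        self.clock = clock          # injectable for deterministic tests
        self.sessions = {}          # token -> (record, expiry time)

    def open_session(self, record):
        """Register a record server-side; return an opaque token."""
        token = secrets.token_hex(16)
        self.sessions[token] = (record, self.clock() + self.ttl)
        return token

    def read(self, token):
        """Return the record while the session lives; None afterward."""
        entry = self.sessions.get(token)
        if entry is None:
            return None
        record, expires = entry
        if self.clock() >= expires:
            del self.sessions[token]   # expired: purge server-side copy
            return None
        return record
```

A real deployment would layer this behind TLS and tie sessions to the live connection, but the shape is the same: lose the connection (or the token), and the patient data is simply gone from the device.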
The bottom line is that iPads are great devices. Very cool. I love mine. But it’s not a magic bullet. It doesn’t automatically solve the issues outlined above with its coolness alone. It’s a form factor we should seriously consider integrating into our “information delivery system,” especially given its consumer-based popularity. But adding the iPad to the inventory, and doing it well, isn’t as easy as some have made it seem.
And, in a nutshell, that’s what Wes said.
Cool (and useful) Healthcare Apps for the iPad
MedScape: This is the most comprehensive free medical app on the web for clinical information. Free
Blausen Human Atlas: View 3D medical animations with a cross searchable medical term glossary. This app is designed to help healthcare professionals improve how they communicate with patients and is also a great learning tool. $19.99
Eponyms: Learn all your medical terms and symptoms for diseases with this Eponyms application. Meant for students but a great refresher/resource for any medical professional. Free
Taber’s Medical Dictionary: Take medical reference to the next level with this app. Includes photos, care statements and more than 60,000 terms. $49.95
Outbreaks Near Me: This application is being used regularly by medical professionals to track the H1N1 flu outbreaks. The application was created by researchers at Children’s Hospital in Boston with help from MIT. Free
Note: Most of these apps are also available on the iPhone, iPod Touch, Android, BlackBerry, and the Web, so if you haven’t bought (or don’t want to buy!) an iPad, you don’t have to be left out.
Want more? Here are a couple of lists for your perusal:
Posted: January 10th, 2012 | Author: Andy Reiman | Filed under: Medical, Technology | Tags: Coordinator for Health Information Technology, ehr, Electronic Health Records, Electronic Medical Records, emr, health and human services, HHS, hipaa, HIT | No Comments »
The Scan Man listened to a terrific talk by Dr. Farzad Mostashari / National Coordinator for Health Information Technology. This is his blog about HIT 2011 achievements. Most interesting to me was the clear evidence provided that patients get better and more appropriate care from doctors who utilize electronic medical records systems instead of paper charts. If your doc is still using paper, ask them why.
ONC earned its nickname as the “Office of No Christmas” during the 2009 Holiday season roughly two years ago when we, along with our colleagues at the Centers for Medicare & Medicaid Services (CMS), announced the proposed regulations to govern the Medicare and Medicaid Electronic Health Record Incentive Programs (EHR Incentive programs) established under the American Recovery and Reinvestment Act of 2009 (Recovery Act). CMS’s proposed rule outlined provisions governing the EHR Incentive programs, including defining the central concept of “meaningful use” of EHR technology.
At the same time, ONC issued an interim final regulation that set initial standards, implementation specifications, and certification criteria for EHR technology. In the closing months of 2009, ONC also issued a flurry of funding opportunities to support health information technology adoption, information exchange, and the workforce needed to make this important Recovery Act program succeed.
A year later, by the 2010 holiday season, vendors, newly accredited certification bodies, and a few vanguard providers were gearing up for the official launch of the EHR Incentive programs, which opened for registration on January 3, 2011. What has happened in the 12 months since then?
I would like to highlight ten of this year’s most notable developments in the world of health information technology and ONC.
1. January: Launch of the Medicare and Medicaid EHR Incentive Programs
Over the past 12 months, the concept of Meaningful Use has thoroughly permeated EHR development and implementation. The marketplace of certified products has grown quickly, interest in Meaningful Use among providers and hospitals is sky-high, and the pace of incentive payments has continued to accelerate.
- Products: As of today more than 1,500 EHRs—about 1,000 ambulatory and 500 inpatient EHRs— have been certified by one of the six private-sector Authorized Testing and Certification Bodies selected by ONC, up from 300 certified products at the start of the year. To date, 672 vendors have products certified under the program (60% of those vendors are small businesses with 50 or fewer employees), which is more than a three-fold increase in the number of vendors with certified products at the beginning of the year. This growth fosters competition and innovation, and gives providers more choices than ever before.
- Eligible Professionals and Hospitals: As we conclude the year, participation in the Medicare and Medicaid EHR Incentive Programs is strong and growing at an impressive rate. As of November 30, 2011, 154,362 eligible professionals and 2,868 eligible hospitals have registered with either of the EHR Incentive Programs. According to a recent survey, more than two-thirds of hospital CIOs and CEOs identified achieving Meaningful Use as their top IT priority. More than half of office-based physicians say they intend to apply for the Medicare or Medicaid EHR Incentive Programs.
More than 20,000 eligible professionals and 1,200 hospitals have already received their incentive payments from CMS, totaling $1.8 billion so far, with December shaping up to be the biggest month yet.
2. February: Launch of DIRECT
The Direct Project provides a simple, secure, standards-based way for providers and other participants to send encrypted health information directly to trusted recipients over the internet—a kind of “health email.” During 2011, the Direct Project went from publishing its first set of consensus-approved specifications to testing in pilots, to initial production implementation across vendor and state boundaries.
The Direct Project’s 200+ committed members reached consensus on two key specifications enabling secure directed transport of health information. Thirteen pilot communities across the nation put these specifications into practice, and successfully exercised and validated them. Technology and service vendors began offering production Direct capabilities to statewide health information exchanges, state and federal agencies, and health care professionals, with more than 35 vendors having implemented Direct by the end of 2011. Larger communities using Direct in production started to emerge, with Direct as part of the core strategy of 40 state HIE grantees.
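The core promise of Direct-style messaging is that the recipient can verify who sent a message and that it arrived unaltered. The real Direct specifications use S/MIME with X.509 certificates; the shared-key HMAC sketch below is only a toy stand-in for that idea, not the Direct protocol itself:

```python
# Toy stand-in for the idea behind Direct-style secure messaging: the
# recipient verifies that a message came from a trusted sender and was
# not altered in transit. (The actual Direct Project specifications use
# S/MIME with X.509 certificates; this shared-key HMAC is only a
# conceptual illustration.)
import hashlib
import hmac

SHARED_KEY = b"demo-key-not-for-production"  # hypothetical shared secret

def send(message: bytes):
    """Attach an authentication tag to the outgoing message."""
    tag = hmac.new(SHARED_KEY, message, hashlib.sha256).hexdigest()
    return message, tag

def receive(message: bytes, tag: str):
    """Accept the message only if the tag verifies."""
    expected = hmac.new(SHARED_KEY, message, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, tag)
```

With certificates instead of a shared key, verification also proves the sender’s identity to a third party, which is what lets Direct participants exchange health information across organizational boundaries.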
3. March: The National Quality Strategy
In March, HHS released the National Quality Strategy for health improvement, the first effort to create a national framework to help guide local, state, and national efforts to improve the quality of care in the United States. The National Quality Strategy recognizes health information technology as critical to improving the quality of care, improving health outcomes, and ultimately reducing costs. Putting the National Quality Strategy into action, HHS subsequently launched two key initiatives that set specific national targets:
- Partnership for Patients, which is working with a wide variety of private and public stakeholders to make hospital care safer by reducing hospital-acquired conditions by 40%, and improving care transitions upon release from the hospital so that readmissions are reduced by 20%.
- Million Hearts campaign, which is a public-private initiative to prevent 1 million heart attacks and strokes over the next five years by improving access to care and increasing adherence with basic preventive medicine.
The evidence shows that health information technology, along with delivery system improvements, will be a key ingredient to the success of these campaigns and other efforts around the country to improve health outcomes. A study published this September in the New England Journal of Medicine, which looked at diabetes care in Cleveland, found:
- 51% of the patients being treated by physicians practices using an EHR received care that met all endorsed standards of diabetes care compared to 7% of patients treated by non-EHR practices
- 44% of patients treated by EHR practices met at least four out of five outcome standards for diabetes compared to 16% of patients in paper-based practices with similar outcomes
4. April: Launch of the Standards “Summer Camp”
At the April HIT Standards Committee meeting, Doug Fridsma, Director of the Office of Standards and Interoperability and Acting Chief Science Officer, kicked off the Summer of Standards—an accelerated effort to support the Stage 2 standards and certification requirements for the EHR Incentive Programs.
These activities took place within the Standards and Interoperability Forums. One of the major accomplishments of summer camp was reaching consensus around the Consolidated Clinical Document Architecture (Consolidated CDA): This summer, 150 committed members of the Standards and Interoperability Framework Transitions of Care Initiative—including providers, technology vendors, informaticists, standards institutions, and federal agencies—worked toward consensus on a single standard for transmitting care transitions data. After more than 1,000 balloted issues were resolved, the standard was approved, and subsequently recommended by the HIT Standards Committee for inclusion in the Stage 2 standards and certification requirements for the Medicare and Medicaid Incentive Programs.
For the first time in our country’s history there is a single, broadly-supported electronic data standard for patient care transitions!
5. June: Spurring Health Information Technology Innovation
Announced June 8, 2011, and made possible by the America COMPETES Act, the Investing in Innovations (i2) Program is the first-of-its-kind government effort to use prizes and challenges to stimulate and accelerate the development of solutions to targeted health care problems. Prizes and challenges are attracting a wide range of innovators from both inside and outside traditional health care communities to address tough problems, spurring industry-wide innovation and rewarding only best-in-class work.
Since June, several i2 challenges have been launched, including:
- “One in a Million Hearts” challenge, a multidisciplinary call to innovators and developers to create applications that activate and empower patients to improve their heart health.
- “Ensuring Safe Transitions from Hospital to Home” challenge, a collaboration with the Partnership for Patients that called on software developers to create easy-to-use applications helping consumers and caregivers improve care transitions. Winners include Axial Transition Suite, ibluebutton, and FlexisVolDSPAN.
6. July: Health Information Technology Workforce
The growth in the number of providers adopting EHR systems, as well as the number of vendors developing EHR products, is positively affecting employment in the health information technology sector. The Bureau of Labor Statistics estimates that within the fast-growing health field, the fastest growth has been among IT-related health workers, a category that added more than 50,000 jobs from 2008 to 2010 alone.
ONC is helping train the health IT workforce the marketplace is demanding through 82 community colleges and nine universities nationwide. As of October 2011, the participating community colleges have graduated 5,717 health information technology professionals, with 10,065 more students currently in the training pipeline. As of November 2011, the universities have produced more than 500 post-graduate and masters-level health information technology professionals, with more than 1,700 expected to graduate by July 2013.
In July, ONC open-sourced the health information technology curriculum initially implemented and tested in the community college program. We have been amazed by the level of interest from across the country and around the world.
7. September: Breach Reporting and Increasing Security Awareness
In early September, HHS’s Office for Civil Rights (OCR) issued the first report to Congress on breaches of unsecured protected health information. Under the Health Information Technology for Economic and Clinical Health (HITECH) Act, health care providers and their business associates must now notify HHS, affected individuals, and in certain cases the media about breaches of unsecured protected health information (PHI).
Notifying individuals and officials of security and privacy breaches serves a number of important purposes:
- It helps ensure that appropriate steps are taken to mitigate any adverse consequences of a breach;
- It promotes public transparency regarding such incidents; and
- It encourages providers and their business associates to take action to avoid such incidents.
With this in mind, providers and vendors are increasingly focused on the need to encrypt data, particularly on mobile devices, and to conduct risk analyses regularly.
8. September: Consumer eHealth Comes to the Fore
In September, ONC organized a Consumer Health IT Summit that formally launched a Consumer e-Health Program. The Summit initiated a pledge program for public and private sector organizations to support individuals in being partners in their health via information technology.
CMS and CDC issued proposed regulations that would make it easier for patients to access lab data, and developed tools to make it easier for consumers to understand how sensitive information held in personal health records may be used.
More than 250 organizations, such as Aetna, the Mayo Clinic, Microsoft, AARP, and Consumers Union, have agreed to make health information easily available to consumers. The Blue Button, for example, which originated with the Veterans Affairs health system as a way of allowing veterans to download key parts of their medical record, is now being made available to millions of others by numerous public and private healthcare organizations.
Together, these organizations cover approximately 100 million people – equivalent to about a third of the nation’s population.
9. October: Regional Extension Centers Surpass their Goals
Adopting and using an EHR can be a daunting proposition for providers, especially those with limited resources, such as small primary care practices and critical access hospitals. ONC has funded 62 Regional Extension Centers (RECs) nationwide and a national Health Information Technology Research Center (HITRC) to help overcome the barriers to adoption and Meaningful Use.
We have also launched healthit.gov, a web-based information resource that serves as the virtual 63rd REC. The REC program offers providers training, information, and technical assistance to accelerate the adoption of certified EHR systems and the achievement of meaningful use. As of mid-December, the REC program had enrolled over 116,000 priority primary care providers, well exceeding its goal to enroll 100,000 priority primary care providers by the end of 2011. What’s more, the majority of primary care providers in rural areas are enrolled with an REC – 70% of rural primary care providers in small practices are receiving REC assistance to make the transition to EHRs and meaningful use.
10. November: Growth in the Adoption of EHRs
Data from the CDC’s gold-standard survey of office-based physicians released in November showed that the percentage of non-hospital-based physicians who have adopted a basic EHR has doubled, from 17% in 2008 to 34% in 2011. Nearly 40% of primary care physicians have adopted an EHR.
The story is similar for hospitals. Prior to passage of the HITECH Act, only 10% of hospitals had adopted basic EHRs, and only 2% of hospitals were believed to have implemented most of the functionality called for in the Medicare and Medicaid Incentive Programs. In the most recent survey of hospitals, 41% of hospitals eligible for the EHR Incentive Programs had adopted certified systems, with an even higher percentage among larger hospitals and academic medical systems.
The growth in EHR adoption has been accompanied by a sharp increase in the number of providers using EHRs to prescribe electronically. Data show that 42% of non-hospital-based physicians are submitting electronic prescriptions through an EHR system, more than a five-fold increase since 2000. In addition, a startling 93% of pharmacies are now capable of receiving electronic prescriptions.
What an amazing year it’s been for health IT and all of us involved in this shared effort and collaborative partnerships.
We’ve navigated through the tumult of these changes by focusing on our key principles:
- Open and inclusive decision-making;
- Focusing on the goal, not the technology;
- Building on what works best;
- Fostering innovation using the market;
- Watching out for the least capable participants; and above all,
- Putting patients’ interests in the center of everything we do.
I’m looking forward to 2012, and I will be blogging next about the five big health IT trends I see for the year to come. Send me your ideas by using the comment section below!