Let us not be afraid or
suspicious of every new innovation that comes our way, for not all are dangerous even if
they appear to be so prima facie. If we develop phobias and deliberately turn our
backs on certain technologies out of apprehension that they would do more harm
than good, then one is compelled to wonder what would have happened had fire never been
used after its discovery simply because someone burnt his fingers or feared its power for
almost universal destruction. What would the consequences for
human existence and the history of mankind have been if the person who discovered the wheel
had carelessly cast the device away because his fertile brain could find no rational use
for it? Scientific discovery marches on incessantly, and the motivation for further
discoveries derives from the successful implementation of earlier ones. These points are made to emphasise
that one must be realistic enough to face the facts and use all the available
opportunities provided by emergent technologies to make this world of ours a better place
to live in. One must not allow phobias to grow to such
proportions that one rejects an innovation which could jolly well prove a good one in
the long run simply because it looked dangerous at first sight. Remember fire? Telemedicine is fascinating. It is "hot" and
is definitely going to be (unless it already is) the "in thing" in health care.
Its implementation will be equally challenging. Its accompanying downsides are certainly
not insurmountable, by any stretch of the imagination. All it demands, nay merits, is a fair
chance to prove itself workable. If the health care industry repudiates it because of its
present inefficiencies, a golden chance of providing the best available
care anytime, anywhere will be lost for a generation or several, and that
would be one of the greatest tragedies to befall mankind.
Analysis & Assessment
Telemedicine is certainly not rocket science, nor is it a sci-fi
representation of a pipe dream by a bunch of loonies whose ideas have gone
ballistic; certainly not any more.
However, one must not expect the moon when going in for it. Apart from a
cool and rational assessment of the follow-on and abandonment options for a given
organisation, investors must realise that, as with the implementation of any other
revolutionary and innovative technology, the implementation of telemedicine too will face
teething problems. There may be equipment failures which, when they do occur (though one
most sincerely hopes not too often), will prove more than a handful. Furthermore, since
this technology is still evolving and no pre-set standards have yet been defined,
telemedicine may need to be re-invented as time goes by. It is not manna from heaven, but
given patience and time it could jolly well become so.
The road to implementation will be long and arduous. Several pitfalls
will have to be sensitively and carefully negotiated, and critical choices made after
very careful consideration. Many perilous roads will appear seemingly out of nowhere and
must be negotiated with extreme prudence and dexterity. The inevitable pot-holes and rough
stretches will have to be deftly manoeuvred around or smoothed out when and wherever
necessary.
A number of very serious efforts are already being made to make
telemedicine a reality, and various project leaders have identified several problems.
It is very meaningful to run these pilot projects, most importantly to identify the areas
of application, the problems, acceptability and usefulness, and the possible
remedies thereof. However, it is equally important, as these projects are run,
to develop products that can be used in real life rather than only in the controllable
environment of a pilot project. Sponsors have to come forward, and in a very big way.
Furthermore, technology per se has an inherently annoying ability to go
awry on a lark (remember those famous Murphy's laws?), and when it does, it is a real
killjoy. No connection can be relied upon always and forever. During peak hours one must have
noticed how often a telephone call does not get through. Those of us who have used
the Internet at all sorts of hours have faced the omnipresent problems of delays and of
losing the connection by being "timed out". Even the various 24-hour news channels,
with all their correspondents and analysts the world over covering their stories through
telephone and satellite connections and gizmos of all shapes, sizes and hues, have faced
the loss of a communications link and had to break off mid-way while bringing on
a "live" event. Communication links during telemedicine consultations
may also suffer such ignominies from time to time. But when it works, it is wonderful,
magical, and quite simply out of this world.
Telemedicine is still largely experimental, though this situation is
expected to change very soon as the technology moves off the desktops of the
researchers and developers and on to the desktops of doctors and their patients. A number of
very serious efforts at developing and implementing various telemedicine projects are under way
right now in various parts of the world. Such projects are demonstrating the feasibility
of the different formulations that may be offered to users and are helping to identify the
various problem areas, both actual and potential. The only thing still lacking is the
overall shape that telemedicine will ultimately take. One is still not sure whether it is
to be used only for referrals, or for communications between care providers, or
for the continuous monitoring of patients, or for the development of CDSS, or for the
maintenance of virtual patient records, or for providing CME to medical professionals, or
for a combination of them all. Fortunately, sooner rather than later, we shall see
telemedicine in our homes, affecting our day-to-day lives and proving
indispensable to all involved.
Telemedicine has had a very long gestation period and has been born only
recently. As it passes through its childhood it will be a problem child: the question
mark of the BCG matrix. What will its future be? It currently has a low market share with
an expected high technology growth rate. It is a cash sink at present and will generate large
negative discounted free cash flows (DFCFs) right now. Once it is transformed into a star, with a high market
share and a high technology growth rate, it will definitely produce large positive DFCFs,
as its overall overheads will be low. It will become a cash cow before its technology
growth rate falls. Of this I am fairly confident.
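The cash-flow argument above can be made concrete with a toy discounted-cash-flow calculation. Every figure below, and the 10% discount rate, is invented purely for illustration of the "negative now, positive later" pattern; it is not a projection for any real telemedicine venture.

```python
# Toy discounted-free-cash-flow (DFCF) sketch: early years negative (the
# "question mark" phase), later years positive (the "star" phase). All
# figures and the 10% rate are invented for illustration.
cash_flows = [-100.0, -50.0, 20.0, 80.0, 120.0]  # years 1..5
rate = 0.10

# Net present value: each year's cash flow discounted back to today.
npv = sum(cf / (1 + rate) ** year
          for year, cf in enumerate(cash_flows, start=1))
print(round(npv, 2))  # 11.95
```

The venture only pays off once the later positive flows outweigh the discounted early losses, which is exactly the star/cash-cow transition described above.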
The computer is now being used for all sorts of purposes under the sun:
fax, exchange of information of all kinds, cyber-shopping using cyber- or
e-cash, trading in shares and other commercial commodities, e-mailing (both textual and
visual). You name it, it is being done. One is then forced to ask: why not use the same
medium to deliver quality health care at any time, from anywhere? It can and must be done.
In conclusion, all I can say, with the greatest conviction that I
can muster, is this: telemedicine is the future of the medical sciences. Surely you
agree. Could and should we perhaps better name it e-medicine, e-health care, or
e-care (in consonance with e-mail, e-biz, e-cash, etc.)?
Ideal Software Solution
Since hardly any software solely dedicated to
telemedicine purposes is available in the market world-wide (though a number of
programmes exist which, if seamlessly interconnected, could be made into a fully
functional telemedicine product), I am enumerating in this section the ideal components and
capabilities of such software. This would not only help software
manufacturers in developing quality products that are not wanting in quantity, but also
help health care managers, administrators, or individual doctors to procure products
that meet the various requirements and link them together with the idea of building their
own telemedicine systems.
The ideal software solution for telemedicine would consist of the
following components, whose individual capabilities are as ut infra:
- An RDBMS back-end built with any one of the following: Oracle, MS
SQL Server, PowerBuilder, Sybase SQL Anywhere, MySQL, PostgreSQL, MS FoxPro, MS Access,
etc. That is, any back-end that obeys SQL instructions and allows ODBC seamlessly (one can
use the JDBC-ODBC bridge if one is building the user interface in Java). The database
structure should allow the data to be stored in an atomic manner, the database tables
should be multi-dimensional in structure, and the various users should work at the level of
views rather than on the actual tables; this helps in fine-tuning performance and
minimises the chances of data corruption.
- The database is physically located on a local server, with mirror sites
located all round the globe. The server is connectable by way of extranets to the outside
world, with suitable firewalls built in to guard against hostile data access. So, in
essence, there will be an intranet at the organisational level (represented by the
chamber, clinic, hospital, institution, etc.) which is also accessible to the outside world.
- The front-end is ergonomically built with any suitable programming
language/interface, but must be able to process information with the help of Active Server
Pages (ASP), PHP, CGI-Perl, or Java Server Pages (JSP) so as to display data and graphics on
screen, generate and print reports and labels on the fly, and automatically format or generate
and then display HTML-based web pages enabled with suitable
Java/JavaScript/ActiveX controls and links to data and other related information, such as
web pages of previous medical details. The data may be handled using XML; if
WAP is to be used, then WML is the key. Payment, charges, insurance, and other financial
details should also be available.
- In my opinion, one (or a combination) of the coding systems, viz., Read
Codes, ICD-9, CPT, or LOINC, should be used to store and forward data, thereby ensuring data
integrity. Transmissions should follow the recommendations of the Health Level 7 system.
- The doctor should be able to store the data directly after verifying the
details, and therefore should be able to edit them. The package must allow for
intelligent querying of the database, automatic generation of summaries, tools for the
analysis of data (textual as well as graphical), and the capability to download and upload
data from and to floppy disks, zip drives, optical disks, communication ports, etc.
- The database tables on the local server are updated at the time when the
least data-access work is being processed by the server; this may be determined
automatically by the server. The central data warehouse may update its database tables from
data extracted from the views located on the local server, so that the remote servers need
not have any database tables physically located on them, which would lessen data corruption.
The central data warehouse would automatically generate metadata and suitably store it on
its own, at a time that suits it. The database of the central data warehouse
should also store the summary web page(s). In combination, these measures would help in
speedy and efficient data access by other servers that are remotely connected to it via
central data warehouses located at other places.
- The front-end must allow access to word processing and spreadsheet
packages, either in the form of built-in connectivity or as part of
the software package itself. Most, if not all, doctors need to use these packages in some
form or other. Additionally, there must be connectivity links for getting on to the
Web at the click of a button, and hypertext and other links to gain access to sites
providing CME.
- CDSS packages for intelligently interpreting health-related data.
These may take the form either of a full-blown expert system or of aids to data
interpretation, e.g., sound-analyser packages with filters which help in magnifying a
selected portion of a sound; picture-analyser packages which help in reconstructing
appearance in three dimensions (3D) from scanned MRI/CT/USG images, which are essentially
only two-dimensional (2D); or tools that allow repeated zooming in on a selected portion of a
photomicrograph to help in predicting/visualising in as great a detail as is necessary.
Such packages may be built using C++ or Java Swing. Naturally, this would mean that
they will have to be installed on the client machines rather than delivered on demand
through the Net.
- A suitable data-warehousing package that would help in the efficient
administration of the telemedicine system. Software manufacturers could help by providing
data-warehousing capabilities in the client-server software itself, even if only to a
limited extent, such as the creation of metadata, the physical storage of data in
multi-dimensional form, and SQL-based querying. However, the package must also be capable of
automatically accessing the local central data warehouse and uploading the data and snapshots of
summary reports at a specified time.
- Information kiosks - the software required for these would need some
special handling. A touch-sensitive screen, a smart-card and/or fingerprint reader, and Internet
connectivity are the absolute necessities; multimedia capabilities would be an excellent,
though optional, extra. The information kiosk will only be a watered-down version of the
client side of the client-server telemedicine system, i.e., not all the capabilities
typically available on a client will be present. What would normally be absent are the
processing and storage capabilities (even storage of a temporary nature, as
within RAM), since these information kiosks would normally not have any such
facilities. This means that these functions would be performed at the server
level, and such information kiosks would therefore need to be connected to a server located
nearby.
- Transmission of data - I prefer the object mode of data handling, for then
incompatibility amongst the various software packages installed on the computers of the end
users is minimised.
- Automatic data backup, recovery from errors in power supply,
comprehensive administrative protocols and tools to maintain the overall health and
operability of the system.
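Several of the points in the list above (atomic storage, users working on views rather than base tables, SQL querying, automatic summary generation) can be sketched together. The schema, table, and column names below are invented for illustration, and SQLite is used merely as a stand-in for whichever RDBMS back-end is chosen.

```python
import sqlite3

# Illustrative schema: observations stored atomically; users query a view,
# never the base table, so the table can be re-tuned without touching clients.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("""CREATE TABLE observation (
    patient_id INTEGER, code TEXT, value REAL, recorded_on TEXT)""")
cur.executemany("INSERT INTO observation VALUES (?, ?, ?, ?)", [
    (1, "BP_SYS", 142.0, "2000-01-05"),
    (1, "BP_SYS", 138.0, "2000-02-05"),
    (2, "BP_SYS", 120.0, "2000-01-07"),
])

# The kind of "automatic summary" a front-end might request via a view.
cur.execute("""CREATE VIEW v_bp_summary AS
    SELECT patient_id, COUNT(*) AS readings, MAX(recorded_on) AS last_seen
    FROM observation WHERE code = 'BP_SYS' GROUP BY patient_id""")

rows = cur.execute(
    "SELECT patient_id, readings, last_seen FROM v_bp_summary "
    "ORDER BY patient_id").fetchall()
print(rows)  # [(1, 2, '2000-02-05'), (2, 1, '2000-01-07')]
```

Because clients only ever see `v_bp_summary`, the base table can be re-indexed, partitioned, or relocated without breaking them, which is the performance-tuning and corruption-limiting benefit the first list item describes.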
There is another area that demands serious consideration, which I wish to
bring to the notice of interested software developers (they are, of course, very much
aware of it already): the problems of transmission, which are of myriad
natures. There are breaks in transmission; delayed connection or non-connection
between networks; connections being "timed out" and hence lost; and the long
periods required to exchange large data such as videos, pictures, and sounds (i.e.,
multimedia files). Sometimes multimedia files might even take up to an hour of
uninterrupted connection at speeds of around 33.6 kbps to
transmit. I would like to add here that the various innovators are already addressing this
issue with all the seriousness it deserves, and the transmission problems are being
actively sorted out.
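The time-outs and dropped links described above are usually handled in software with a retry loop. The sketch below shows the bare pattern; the flaky `flaky_send` function and the retry limit are invented for illustration, standing in for a real transmission call.

```python
# Bare retry pattern for an unreliable link. `flaky_send` stands in for a real
# transmission call and is rigged to fail twice before succeeding.
attempts_seen = []

def flaky_send(payload):
    attempts_seen.append(payload)
    if len(attempts_seen) < 3:
        raise TimeoutError("link timed out")
    return "delivered"

def send_with_retry(payload, max_tries=5):
    """Retry on time-out; re-raise only after max_tries failures."""
    for attempt in range(1, max_tries + 1):
        try:
            return flaky_send(payload)
        except TimeoutError:
            if attempt == max_tries:
                raise

result = send_with_retry("ECG trace")
print(result, len(attempts_seen))  # delivered 3
```

Real systems add a waiting period between attempts and a hard upper bound so a dead link fails visibly rather than hanging a consultation.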
Transmission Sequence
The sequence of transmission of messages in telemedicine could take any
one of the following forms. The exact process utilised depends
on the software manufacturer; ultimately it will be both the transmission time lag and the
ease-of-use factors that determine which method turns out to be the winner. The
processes are as ut infra.
- Straight-forward transmission of HTML, ASP, PHP, CGI-Perl, or JSP
based web pages, which can easily be viewed with the help of the commonly used web
browsers. These web pages would indubitably contain JavaScript, Java applets, and ActiveX
controls, and the browsers would need to be suitably enabled to display them. The
advantage of this method is that all one needs is a set of pre-formatted web pages
into which the relevant messages may be inserted. This would help in delivering messages
in a standardised format, thereby allowing not only ease of use but also
ease of learning. The web browsers that the users would use to view the pages will
already be installed on their respective machines, and hence no added products would need
to be bought; thus wider acceptance of such a product is to be expected.
The transmission lag time should not be significantly different from what the end-user
already faces in using the Internet, and hence the end-user should not perceive any significant
difference between using this product and any other Internet-related product. The disadvantage
is that such a product would be a very simple one, and any smart programmer with sound
knowledge of the languages involved would easily be able to duplicate it; hence the
manufacturer who comes out with the product first will lose his market share in no time.
- A specialised package which uses a back-end database with a web
page-like front-end (similar in design to § 1 ut supra). The advantages of this method, over
and above those of § 1, are that instead of pre-formatted templates, the proper mode of
display may be ensured. This improves the procedures for standardisation and brings about
congruity in message exchange. The ability to use SQL will dramatically improve the
overall capability of the package. Data will be more secure, as only the correct front-end,
displayed as a web page on the user's screen, will be able to correctly
display the information in the right format. If the data is transmitted as objects instead
of raw data, it automatically becomes more secure and the transmission time is much
reduced; both of these necessarily improve performance. This also helps the software
company to protect its product, as the customer will have to buy the particular software
from it alone. Another company may simulate or develop its own package on similar lines,
but unless it can somehow get its hands on the actual field formats of
the database, it will not be able to sell the very same product. The disadvantage is
that any software company that brings the product to market will have to go in for
hard-sell and pour a lot of money into its marketing efforts to convince the respective
end-users that its product is the superior buy and provides good value for money.
A tough proposition indeed!
- Similar in concept to § 2 ut supra, but the data transmitted
would be in the form of codes (e.g., Read Codes) which are sent as objects and are
displayed in the customised web page of the end-user. These codes would be read from the
centralised database and inserted into their proper places automatically, further
improving overall performance. The disadvantage is that unless the software company
can get the rights from the British Crown to use these codes (and I am sure that the
annual royalty will be quite substantial), they cannot be used. One may develop similar
codes on similar lines, but that would take a lot of effort and might even be a fit case
for copyright infringement. The greatest payoff of using such codes instead of actual
data is that not only is the transmission more difficult to decode (particularly if it is
encrypted, since one also needs compatible software to read the codes
accurately), but it also allows the various end-users to interpret the
codes in their own language. Thus, the details of a Japanese patient who was on holiday in
the Middle East may be read by a Norwegian doctor without much ado. All of the persons
involved can read all of the information in their local language and script!
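The multilingual payoff of § 3 can be sketched as a per-locale lookup: only the code crosses the wire, and each end renders it from its own local table. The code value, table entries, and translations below are rough and purely illustrative, standing in for a licensed system such as the Read Codes.

```python
# Only the code travels over the link; each end holds descriptions in its own
# language. Entries and translations are illustrative, not official codes.
DESCRIPTIONS = {
    "en": {"H3": "asthma"},
    "no": {"H3": "astma"},
}

def render(code: str, locale: str) -> str:
    """Look up the local-language description; fall back to the bare code."""
    return DESCRIPTIONS.get(locale, {}).get(code, code)

print(render("H3", "no"))  # astma
print(render("H3", "en"))  # asthma
```

The same stored record thus reads naturally to the Norwegian doctor and to the Japanese patient alike, with no translation step in the transmission itself.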
A hub-and-spoke arrangement is my method of choice for telemedicine
networking, as it would help provide all the facilities that telemedicine promises. The concept
was originally developed for air services, so that small regional/local carriers could
service a large area with short-haul flights while the long-distance
routes were serviced by larger companies.
A primary hub at the local/zip-code level is connected by LAN to secondary hubs at
the district/county/regional level, and these secondary hubs are in turn
connected by WAN to tertiary hubs at the zonal/state/country level, which serve large areas.
These tertiary hubs are again connected to other tertiary hubs, so that a local hub
at one end can easily interact with another.
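The hub hierarchy just described can be sketched as a small graph in which a message is routed from one local hub to another via the secondary and tertiary tiers. The hub names and topology below are invented; breadth-first search simply finds the shortest hub path.

```python
from collections import deque

# Toy hub-and-spoke topology: primaries (P) hang off secondaries (S),
# secondaries off tertiaries (T), and tertiaries interconnect. Names invented.
links = {
    "P1": ["S1"], "P2": ["S2"],
    "S1": ["P1", "T1"], "S2": ["P2", "T2"],
    "T1": ["S1", "T2"], "T2": ["S2", "T1"],
}

def route(src, dst):
    """Breadth-first search: shortest hub path between two endpoints."""
    seen, queue = {src}, deque([[src]])
    while queue:
        path = queue.popleft()
        if path[-1] == dst:
            return path
        for nxt in links[path[-1]]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None

print(route("P1", "P2"))  # ['P1', 'S1', 'T1', 'T2', 'S2', 'P2']
```

A consultation between two local hubs thus climbs to the tertiary tier and back down, which is exactly why the tertiary hubs must interconnect.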
Building dedicated hubs solely for telemedicine would be quite a
costly affair and would in all probability require proper bureaucratic authorisation, all
of which means that there will be delays and difficulties at the level of implementation.
The Internet is an entity that is already there, and one can jolly well use it. The various
security concerns, and the methods of overcoming them, are addressed in a separate
section.
Client-server technology is the method of choice for building up the
basic infrastructure. The various end-users will be the "clients", while the
data is stored on the "servers". These servers will be connected to the Internet,
with suitably safe firewalls installed to restrict access to authorised
end-users only. Since these servers would be interconnected, realistically any end-user from
anywhere should be able to access them.
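The firewall-plus-authorisation idea can be reduced to its simplest software form: the server refuses any request whose credentials are not on its authorised list. The user names, tokens, and record fields below are all invented for the sketch; a real deployment would use proper authentication, not a plain lookup table.

```python
# Minimal authorisation gate of the kind a telemedicine server would sit
# behind. Credentials and record fields are invented for illustration.
AUTHORISED = {"dr_rao": "token-123", "clinic_7": "token-456"}

def fetch_record(user, token, patient_id):
    """Serve a record only to a recognised user presenting the right token."""
    if AUTHORISED.get(user) != token:
        raise PermissionError("access denied")
    return {"patient": patient_id, "status": "record served"}

ok = fetch_record("dr_rao", "token-123", "P-001")
print(ok["status"])  # record served
```

Anyone outside the authorised list, or presenting a stale token, is turned away before any patient data leaves the server, which is the behaviour the firewall paragraph asks for.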
The recommendations of the Health Level 7 (HL7)
standards should be followed in all data transmissions, as they have a very sound basis for
this very purpose.
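HL7 version 2 messages are pipe-delimited segments, and a minimal flavour of what following the standard means can be sketched by pulling the patient name out of a PID segment. The sample message below is invented and far simpler than a real HL7 feed, which carries many more fields and escaping rules.

```python
# HL7 v2 messages are carriage-return separated segments with pipe-delimited
# fields. This invented two-segment sample is far simpler than a real feed.
message = "MSH|^~\\&|SENDAPP|SENDFAC|RECVAPP|RECVFAC\rPID|1||12345||Doe^John"

def segments(msg):
    """Index each segment by its three-letter name (MSH, PID, ...)."""
    return {line.split("|", 1)[0]: line.split("|") for line in msg.split("\r")}

segs = segments(message)
# PID field 5 holds the patient name, with components separated by '^'.
family, given = segs["PID"][5].split("^")
print(family, given)  # Doe John
```

Because every conforming system agrees on this segment-and-field layout, a record emitted by one vendor's telemedicine server can be parsed by another's without a bespoke converter.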