    SAMJ: South African Medical Journal

On-line version ISSN 2078-5135; Print version ISSN 0256-9574

    SAMJ, S. Afr. med. j. vol.111 n.2 Pretoria Feb. 2021

    https://doi.org/10.7196/samj.2021.v111i2.14686 

    RESEARCH

     

Examining the reliability of ICD-10 discharge coding in Red Cross War Memorial Children's Hospital administrative database

     

     

A Daniels, I; R Muloiwa, II; L Myer, III; H Buys, IV

I FCPaed (SA), MMed (Paediatrics); Department of Paediatrics and Child Health, Faculty of Health Sciences, University of Cape Town, and Red Cross War Memorial Children's Hospital, Cape Town, South Africa
II FCPaed (SA), MSc (LSHTM); Department of Paediatrics and Child Health, Faculty of Health Sciences, University of Cape Town, and Red Cross War Memorial Children's Hospital, Cape Town, South Africa
III MD, PhD; Division of Epidemiology and Biostatistics, School of Public Health and Family Medicine, Faculty of Health Sciences, University of Cape Town, South Africa
IV FCPaed (SA), MRCP (UK), MSc Paed (SA); Department of Paediatrics and Child Health, Faculty of Health Sciences, University of Cape Town, and Red Cross War Memorial Children's Hospital, Cape Town, South Africa


     

     


    ABSTRACT

BACKGROUND: Discharge diagnostic data from hospital administrative databases are often used to inform decisions relating to a variety of vital applications. These may include the allocation of resources, quality-of-care assessments, clinical research and formulation of healthcare policy. Accurately coded and reliably captured patient discharge data are of paramount importance for any hospital and health system to function efficiently.
OBJECTIVES: To retrospectively examine the reliability of the International Classification of Diseases version 10 (ICD-10) discharge coding in the Red Cross War Memorial Children's Hospital (RCWMCH) administrative database for primary and secondary discharge diagnoses, and to formulate recommendations for improvement to the current system.
METHODS: This study was a retrospective folder review of 450 patient admissions to the short-stay and general paediatric wards at RCWMCH between 1 August 2013 and 31 July 2014. The principal investigator (PI) completed ICD-10 discharge coding for each admission and compared it with the corresponding admission data captured for each patient in the Clinicom (Siemens Medical Solutions, Germany) health information system. Agreement comparison was done to 4- and 3-character ICD-10 code specificity.
RESULTS: Of the initial 450 randomly selected folders, 396 (88%) were analysed during the folder review process. The median number of total diagnoses (primary diagnosis plus secondary diagnoses) coded by the PI folder review was 3, with a distribution of 1 - 10 (interquartile range (IQR) 2 - 4). The median number of total diagnoses coded in Clinicom was 1, with a distribution of 1 - 3 (IQR 1 - 1). Agreement of primary diagnosis coding to 4 characters was 26.3%, with slight improvement to 34.3% when assessed to 3 characters. Agreement of secondary diagnoses to 4 characters was 14.9%, and 27.7% when assessed to 3 characters.
    CONCLUSIONS: Reliability of administrative ICD-10 discharge data from RCWMCH is poor. Inadequacies regarding the employment of dedicated and/or adequately trained coding personnel may significantly contribute to the problem and should be addressed.


     

     

    Administrative discharge data contain vast amounts of patient information and have enormous potential for use in a variety of applications beyond simply classifying morbidity, mortality and procedures for statistical purposes. These may include: applying the data for use in hospital reimbursement protocols; allocation of resources; outcomes monitoring; quality-of-care assessment; and clinical, epidemiological and health services research.[1] Harnessing the potential of this wealth of information is extremely attractive, given the cost-saving benefits over other forms of data collection; this is especially true for resource-limited settings.

    Diagnoses and other clinical information obtained during the course of hospital admissions are recorded in various data sources, including the patient medical record and electronic administrative database. At discharge, diagnostic coding is recorded for all conditions that affected patients during their admission or episode of care. This should include a single 'main or primary diagnosis' and all the 'secondary diagnoses', if present.[2] In most settings, discharge diagnoses are coded using the International Classification of Diseases version 10 (ICD-10),[3] the most widely used classification of diseases.

ICD-10 allows for a very specific degree of diagnostic coding, with up to 5 characters to make up an alphanumeric diagnostic code. The first letter refers to the chapter in which the code is contained and the subsequent 2, 3 or 4 numbers refer to a related group of diseases, and then a specific disease within that group. The more characters included, the more specific the diagnostic code is for the condition (e.g. A03 (3 characters) for Shigellosis v. A03.1 (4 characters) for Shigellosis due to S. flexneri). Coding to the maximum level of specificity is not always possible, as the appropriate diagnostic information may not be available/documented for each case. However, ICD-10 guidelines dictate that diagnoses should be coded to the highest possible level of specificity, as this will dramatically improve the quality and usefulness of the derived data.[4] A systematic review of coding in the UK noted generally good accuracy up to the third character level of the ICD-10 diagnostic code, with a significant drop thereafter, suggesting that most errors occur from the fourth character level of specificity.[5] Diagnostic coding at Red Cross War Memorial Children's Hospital (RCWMCH) is done using ICD-10 to a maximum of 4 characters.
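To illustrate how character-level specificity works in practice, the short sketch below (a hypothetical Python example, not part of the study) compares two ICD-10 codes after truncating them to a chosen number of characters; this is the same idea that underlies the 4- and 3-character agreement assessments described later.

```python
# Illustrative sketch (not the authors' code): comparing two ICD-10 codes
# at 4-character vs 3-character specificity. The codes are hypothetical
# examples; the dot is dropped so that "A03.1" counts as 4 coding characters.

def normalise(code: str) -> str:
    """Strip the dot and whitespace from an ICD-10 code, e.g. 'A03.1' -> 'A031'."""
    return code.replace(".", "").strip().upper()

def codes_agree(code_a: str, code_b: str, n_chars: int = 4) -> bool:
    """True if two ICD-10 codes match on their first n_chars characters."""
    return normalise(code_a)[:n_chars] == normalise(code_b)[:n_chars]

# A03.1 (shigellosis due to S. flexneri) vs A03.9 (shigellosis, unspecified):
print(codes_agree("A03.1", "A03.9", n_chars=4))  # False - they differ at the 4th character
print(codes_agree("A03.1", "A03.9", n_chars=3))  # True  - same 3-character category A03
```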

Previous studies report that collected patient data, such as discharge diagnoses, are plagued with inaccuracies as a result of numerous errors - from admission to discharge.[1,6-8] Studies assessing agreement between medical chart and administrative databases for specific diseases routinely report poor correlation and advise caution when using these databases in isolation for research purposes. Most of the published research originates in the developed world and many of the articles have reported poor reliability in administrative discharge data.[6,7] This is concerning for poorly resourced state health facilities, such as those in South Africa (SA), where adequately trained and dedicated coders are rarely employed and where few or no published data exist that specifically assess hospital administrative data reliability.

    RCWMCH is a dedicated children's referral hospital in Cape Town, SA, which caters for about 18 500 inpatient and 260 000 outpatient visits per year. Our study examined the reliability of discharge ICD coding captured within the Clinicom (Siemens Medical Solutions, Germany) health information system (HIS) at RCWMCH. Mandatory ICD-10 coding in SA was rolled out nationally in a phased approach, with the final phase of implementation in July 2014.[9] The rationale for implementation was: to standardise data collection processes across the healthcare industry; to allow for systematic recording, analysis and interpretation of morbidity and mortality data to facilitate national and international comparison; and, importantly, to support the development of the National Health Insurance (NHI) whose sustainability will depend on quality health information data and systems.[9] There is also an important revenue and reimbursement aspect, with ICD-10 codes determining tariff levels for fee-paying patients and those who have private medical aid.[9] At RCWMCH, Clinicom uses ICD-10 coding to capture data and generate reports of hospital patient data and events in keeping with the aforementioned national implementation rationale. It enables a monitoring system for patient numbers, burden of disease, outbreak monitoring, length of stay, diagnoses, treatment and procedural information, as well as intensity of care rendered. The data stored in Clinicom can be shared, analysed and used by medical staff, health information managers, policymakers, researchers and employers for current-state analysis and comparison over different epochs, both locally and internationally. Many provincial intersectoral planning meetings utilise the data reports to approach solutions to health-related community problems (e.g. immunisation, pneumonia, diarrhoea and malnutrition). The captured data further allow for financial business units to interrogate cost drivers within the system. The reports feed through to the provincial administration departments and assist with allocation of health budgets. The data further feed into the National Health Information System of SA (NHISSA) to inform central government.

Unreliable data could significantly affect the quality of the administrative data sets and thus all data and statistics abstracted from these. At RCWMCH, the coding is entered into discharge summaries by untrained junior staff, interns and medical officers, and then captured by ward clerks or their interns, who have no medical background and who often trawl through pages of handwritten records to enter the information into Clinicom. Hereafter, the hospital information management unit generates monthly reports that utilise data from the Clinicom system. The quality-control processes at ward level seldom go beyond checking the correctness of individual entries (e.g. ICD-10 codes, dates, times, signatures and stamps) rather than performing true data-quality checks. The study aimed to retrospectively examine the reliability of ICD discharge coding in the RCWMCH administrative database for primary and secondary discharge diagnoses, and to formulate recommendations for improvement to the current system.

     

    Methods

This study was a retrospective folder review of patient medical records and data captured in the Clinicom HIS at RCWMCH in Cape Town, covering a random sample of admissions over a 12-month period. The study population included patients admitted and treated in the short-stay and general medical wards at RCWMCH between 1 August 2013 and 31 July 2014 and whose discharge information was captured and recorded in the Clinicom HIS. Patients admitted to short-stay wards are typically admitted for 1 - 2 days and thereafter either discharged home or transferred to lower-level facilities to complete treatment. Patients admitted to general medical wards are those who require further investigation and/or treatment for a longer duration at a tertiary care hospital. Some patients initially admitted to short-stay wards may be escalated to admission in general medical wards.

Patients discharged from the rehydration subdivision of the short-stay ward were excluded; their inclusion would unfairly bias the results towards correct coding of the primary diagnosis in the Clinicom system, as this ward is almost exclusively used for treating diarrhoeal disease, thereby making the primary diagnosis ICD-10 code a given.

    Definitions

    Primary or main diagnosis. 'In South Africa, the "main condition" is defined as the condition, diagnosed at the end of the episode of healthcare, primarily responsible for the patient's need for treatment or investigation. If there is more than one "main condition treated", then the most clinically severe or life-threatening condition should be selected.'[4]

Secondary diagnosis. 'Additional conditions that affect patient care or may co-exist with the main condition in terms of requiring any combination of clinical evaluation, therapeutic treatment, diagnostic procedures, extended length of hospital stay, increased nursing care and/or monitoring. This includes any comorbidity that the patient may have. There may be multiple secondary diagnoses per patient.'[4]

    Total diagnoses. The sum of the primary diagnosis and all secondary diagnoses present during a healthcare encounter/hospital admission.

    Diagnostic codes. The alphanumeric codes given for all primary and secondary diagnoses as per ICD-10.

    Hospital information system. A computerised information system designed to help hospitals manage and process all aspects of their daily operations in a more organised, integrated and efficient manner. Clinicom is one of several HIS software packages.

    Administrative data. These data are routinely generated and most often stored within the HIS at every encounter with the healthcare system, e.g. epidemiological/demographic data, a diagnosis, a procedure and an admission to hospital.

    Data collection

We randomly selected 450 folders from a total of 7 535 discharged patients entered into the Clinicom HIS for the short-stay and general medical wards during the 12-month period between 1 August 2013 and 31 July 2014. We estimated that a sample of 350 would be sufficient to give a precision within 5% of any point estimate up to 50% for agreement/non-agreement. We took a random sample of 450 folders to accommodate the possibility of up to 20% of randomised folders not containing sufficient data for analysis. To randomise our folder selection, we assigned each of the 7 535 eligible folder numbers from the Clinicom HIS a sequential code (1, 2, 3, 4, etc.) and then randomly selected 450 of these codes using an electronic random number generator. The patient folders were reviewed by the principal investigator (PI), and data variables were extracted and captured into an electronic spreadsheet.
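For readers interested in reproducing this kind of selection, a minimal sketch of the randomisation step is shown below, assuming the 7 535 eligible folder numbers are available as a list. The folder numbers, seed and use of Python's random module are illustrative assumptions; the authors simply report using an electronic random number generator.

```python
# Illustrative sketch of drawing 450 folders from 7 535 eligible records.
# Folder numbers and the seed are placeholders, not actual study data.

import random

eligible_folders = [f"FOLDER{i:05d}" for i in range(1, 7536)]  # 7 535 placeholder folder numbers

random.seed(2014)                                 # fixed seed so the draw is reproducible
sample = random.sample(eligible_folders, k=450)   # 450 folders drawn without replacement

print(len(sample))    # 450
print(sample[:3])     # first few sampled folder numbers
```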

The data collected included patient clinical information: primary discharge diagnosis and up to 8 secondary diagnoses. The PI assigned an appropriate ICD-10 code for each folder's primary and secondary diagnoses to 4-character specificity (where possible). ICD-10 coding was completed as per the SA ICD-10 coding standards,[4] using an online version of the World Health Organization (WHO) ICD-10 version (2010),[3] which was the version in use at RCWMCH during the time of the study population admissions. An expert physician (EP) investigator, one of the study investigators, reviewed a randomly selected sample of 20 folders (5%) for quality control and to assess PI-EP agreement, with the EP as the reference standard. The EP followed the same chart abstraction procedure as the PI. The PI and EP were blinded to the corresponding diagnostic coding and other relevant data recorded in the Clinicom system for each patient, as well as to each other's diagnostic coding.

The data abstracted and ICD coding done by the PI were regarded as the reference/gold standard for the study. The corresponding data/coding for each folder were abstracted from the Clinicom administrative data set into the spreadsheet to create the second data set for comparison with the PI data set.

    Statistical analysis

Continuous data were summarised using medians and interquartile ranges (IQRs), while proportions were depicted using percentages and 95% confidence intervals (CIs), as appropriate. For initial comparison we described the number of diagnoses per patient for the PI-abstracted data and for the administrative database, using conventional descriptive methods (mean and standard deviation (SD) or median (IQR)) or proportions, e.g. total diagnoses, total secondary diagnoses.
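To make these summaries concrete, here is a minimal Python sketch of a median with IQR and a proportion with a 95% CI. The actual analysis was performed in Stata 13.0; the counts, variable names and Wald-style CI below are illustrative assumptions rather than study output.

```python
# Illustrative only - the study used Stata 13.0. Data and counts are invented;
# the 104/396 pair is merely chosen to be in the region of the reported 26.3%
# primary-diagnosis agreement.
import math
import statistics

total_diagnoses = [3, 1, 2, 4, 1, 5, 2, 3]       # hypothetical diagnoses-per-admission counts

median = statistics.median(total_diagnoses)
q1, _, q3 = statistics.quantiles(total_diagnoses, n=4)   # quartiles give the IQR bounds
print(f"median {median}, IQR {q1} - {q3}")

agreements, n = 104, 396                          # hypothetical agreement count and sample size
p = agreements / n
se = math.sqrt(p * (1 - p) / n)                   # Wald standard error for a proportion
print(f"agreement {p:.1%} (95% CI {p - 1.96 * se:.1%} - {p + 1.96 * se:.1%})")
```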

Reliability of Clinicom recording was assessed by calculating proportions of agreement between Clinicom-generated and PI-generated diagnostic records at both 4- and 3-character levels. We included an assessment to 3 characters, as we believed that disagreement only at the fourth-character level may represent a much less clinically significant disagreement, and such records could still provide some useful information for various applications.
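As a purely illustrative sketch of this calculation (the study's analysis was done in Stata; the codes, list names and helper functions below are hypothetical), the agreement proportion can be computed by truncating each pair of codes to 4 or 3 characters before comparing them:

```python
# Illustrative sketch of the proportion-of-agreement calculation for primary
# diagnoses, assuming two parallel lists of ICD-10 codes (PI review vs Clinicom).
# All codes and counts are invented; this is not the authors' Stata analysis.

def truncate(code: str, n_chars: int) -> str:
    return code.replace(".", "").upper()[:n_chars]

def primary_agreement(pi_codes, clinicom_codes, n_chars):
    """Proportion of admissions whose primary diagnosis matches to n_chars characters."""
    matches = sum(
        truncate(p, n_chars) == truncate(c, n_chars)
        for p, c in zip(pi_codes, clinicom_codes)
    )
    return matches / len(pi_codes)

pi_primary       = ["A03.1", "J18.9", "E43", "B05.9"]
clinicom_primary = ["A03.9", "J18.9", "E46", "B05.9"]

print(primary_agreement(pi_primary, clinicom_primary, n_chars=4))  # 0.5
print(primary_agreement(pi_primary, clinicom_primary, n_chars=3))  # 0.75
```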

Agreement was calculated similarly for primary diagnoses, secondary diagnoses and for any diagnoses (at least one similar diagnosis, irrespective of whether it was secondary or primary), with PI-generated diagnostic records as the gold standard. The 'any diagnosis' assessment was used to disregard the ordering of diagnoses and to assess to what degree at least one of the PI total diagnoses was listed among the total diagnoses for each patient in the administrative data set. Agreement here would suggest at least some thread of commonality between the two data sets. For quality control, a similar analysis was carried out comparing EP- and PI-abstracted data in a small sample, using the EP as the reference/gold standard.
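A similar hypothetical sketch, again not the authors' code, applies to the 'any diagnosis' assessment: a record counts as agreeing if the PI and Clinicom code sets share at least one diagnosis at the chosen character level, regardless of primary/secondary ordering.

```python
# Illustrative sketch of the 'any diagnosis' assessment, assuming each record
# holds the full set of PI-coded and Clinicom-coded diagnoses. Data are invented.

def truncate(code: str, n_chars: int) -> str:
    return code.replace(".", "").upper()[:n_chars]

def any_diagnosis_agreement(records, n_chars):
    """Proportion of records sharing at least one diagnosis to n_chars characters."""
    agree = 0
    for pi_codes, clinicom_codes in records:
        pi_set = {truncate(c, n_chars) for c in pi_codes}
        clin_set = {truncate(c, n_chars) for c in clinicom_codes}
        if pi_set & clin_set:           # non-empty intersection => at least one match
            agree += 1
    return agree / len(records)

records = [
    (["A03.1", "E43"], ["A09.9"]),             # no overlap
    (["J18.9", "B05.9"], ["B05.1", "J18.9"]),  # overlap on J18.9
]
print(any_diagnosis_agreement(records, n_chars=3))  # 0.5
```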

    All data were analysed in Stata 13.0 (StataCorp., USA).

    Ethical approval

    Ethical approval was obtained from the Human Research Ethics Committee, University of Cape Town (ref. no. HREC 021/2017) and the RCWMCH administration. The study was conducted in accordance with the Declaration of Helsinki, 2013. No identifying data were used in our password-protected electronic database.

     

    Results

Of the initial 450 randomly selected folders, 396 (88%) were analysed during the folder review process. Thirty-three (7.3%) were excluded because the folders could not be located; 12 (2.7%) owing to missing relevant notes in the folders; 8 (1.8%) because no ICD discharge diagnosis codes were entered into Clinicom; and 1 (0.2%) because the patient was admitted to a surgical ward and not to the short-stay or general medical wards.

    In the sample, there were 283 (71%) patients admitted and discharged from the short-stay ward and 113 (29%) from the general medical wards (Fig. 1). The PI marked 28 (7%) folders as 'difficult to code'.

    The first assessment of reliability was done by examining completeness of coding by comparing the total diagnoses coded per discharge in Clinicom with the PI folder review. The median number of total diagnoses (primary diagnosis plus secondary diagnoses) coded by the PI folder review was 3 (range 1 - 10; IQR 2 - 4). The median number of total diagnoses coded in Clinicom was 1 (range 1 - 3; IQR 1 - 1).

    Agreement of primary diagnosis coding to 4 characters was 26.3% and showed slight improvement to 34.3% when assessed to 3 characters (Table 1). Agreement for secondary diagnoses to 4 and 3 characters was 14.9% and 27.7%, respectively. The poor secondary diagnoses agreement was expected, given the undercoding of diagnoses noted from the completeness examination. No significant difference in agreement was observed between the general medical and short-stay wards.

    Agreement for at least 1 similar diagnosis to 4 and 3 characters was 27.5% and 36.4%, respectively (Table 2). No significant difference in agreement was observed between the general medical and short-stay wards.

    The analysis was repeated with the exclusion of cases marked as 'difficult to code' by the PI. The agreement showed little overall improvement and was in some cases even poorer.

    Quality control

    The median number of total diagnoses (primary diagnosis plus secondary diagnoses) coded by the PI folder review was 3, with a distribution of 1 - 10 (IQR 2 - 4). The median number of total diagnoses coded by the EP folder review (n=20) was 3.5, with a distribution of 1 - 10 (IQR 2 - 5). Agreement of primary diagnosis coding to 4 characters was 45% and showed slight improvement to 65% when assessed to 3 characters. Agreement of secondary diagnoses to 4 and 3 characters was 70% and 75%, respectively. Agreement of at least one similar diagnosis to 4 and 3 characters was 95% and 100%, respectively.

     

    Discussion

    Our study demonstrates extremely limited agreement between discharge diagnostic coding abstracted from the medical records and that in the hospital administrative database at RCWMCH. The study highlights two fundamental issues regarding the quality of the administrative discharge data, i.e. the overall undercoding of diagnoses, with limited secondary diagnoses recorded per patient in the administrative data compared with the medical chart review; and the overall poor agreement between the administrative data ICD-10 coding compared with the medical chart review.

    Previous studies have shown the phenomenon of undercoding of diagnoses in administrative data. This has mostly been in relation to undercoding of specific diagnoses, which may be poorly defined or inherently complex to diagnose (e.g. heart failure, sepsis, respiratory distress syndrome).[10-16] Our study has, in contrast, demonstrated striking overall undercoding in the administrative data throughout the entire sample. The median number of total diagnoses coded by the PI folder review was 3, with a distribution of 1 - 10 (IQR 2 - 4). The median number of total diagnoses coded in Clinicom was 1, with a distribution of 1 - 3 (IQR 1 - 1). This illustrates a significant discrepancy in completeness of diagnostic coding between Clinicom and the PI folder review. It is worth noting that the PI-EP quality-control comparison showed a very similar number of total diagnoses with similar distributions, as well as comparable IQRs.

The second issue that was highlighted was the poor diagnostic code agreement between the PI folder review and the administrative data. Primary diagnosis agreement to 4 characters was only 26.3%, and when limited to 3 characters, it showed only marginal improvement to 34.3%. A previous systematic review of discharge coding accuracy showed a significant improvement (39% in some studies) when agreement analyses were limited to 3 characters, suggesting that a high proportion of errors occur at the fourth character.[5] These numbers are well in excess of the 8% improvement noted in our study. As the systematic review was limited to hospitals in the UK, it serves as another example of the limited generalisability of high-income country (HIC) studies in this context. Further, the resources available for coding in low- and middle-income countries (LMICs) differ substantially from those in HICs. Even poorer agreement for secondary diagnoses was to be expected, given the undercoding noted in the initial analyses. A more significant improvement was noted when limiting analyses from 4 to 3 characters (14.6% and 26.8%, respectively).

The best agreement results were noted when assessing for 'at least one similar diagnosis', and even these were still remarkably poor at 28.3% and 37.2% for 4- and 3-character agreement, respectively. This analysis is perhaps the most telling, given that it was poor at both 3- and 4-character assessment, despite essentially disregarding the ordering between primary and secondary diagnoses, and was therefore the most 'forgiving' of the analyses. Ordering of diagnoses in a population with a high burden of complex medical issues (e.g. HIV, malnutrition and poverty-related illnesses) can be particularly challenging with regard to singling out one of many significant diagnoses as the primary diagnosis. This difficulty in ordering was also possibly at play in the EP-PI quality-control analysis for primary diagnoses, which yielded the lowest agreement of all such analyses. However, when assessing for at least one similar diagnosis, the agreement was near perfect at 95% and 100% for 4 and 3 characters, respectively. There is no clear consensus on what constitutes an acceptable level of agreement for discharge coding reliability in administrative data; however, these quality-control results were certainly adequate to serve as a benchmark for comparison.

    Finally, when stratifying the various agreement analyses to short-stay and general medical wards, no significant difference was noted in the results, suggesting that the poor coding reliability is likely to be widespread across different wards in the hospital. There are several possible explanations for these results, which may include some of the following:

    lack of dedicated and adequately trained expert coding staff

    inadequate training of current staff (medical and non-medical) responsible for diagnostic coding

    poor medical chart documentation by clinicians

    inherent limitations of the ICD-10 coding system regarding the disconnect between the rigid ICD diagnostic descriptors and local clinical concepts/terminology[17]

    a culture of unimportance attached to discharge coding among busy medical and non-medical staff

lack of a direct financial incentive for complete and accurate discharge coding in government-funded state hospitals, such as RCWMCH, in contrast to private healthcare and many HIC healthcare systems, where optimised and accurate discharge coding, which is used to calculate billing, equates to significant revenue for hospitals[11]

    lack of regular complete administrative data auditing.

Many of these reasons were possibly responsible for poor coding reliability to varying degrees, and the lack of adequately trained and dedicated clerical coding staff should be considered a significant contributor to the poor results. Capturing of the coding at ward and outpatient level is left largely to infrequently trained ward clerks, who do not review the medical notes when coding in Clinicom. This is partly due to staff constraints and, more importantly, to many of them not having a medical background and/or adequate training in diagnostic coding. They rely on the admission sheets, ward admission books and discharge summaries when available, which are often sparse and of variable quality. Furthermore, there are no quality-control measures to assess the reliability of discharge coding in Clinicom.

Previous studies showing good administrative data validity and consistency have suggested that the dedicated expert coders employed in these settings are the primary reason for the good results.[1,18,19] Our results also echo those noted in the few other studies from LMICs, which have reported administrative data quality to be significantly poorer than that reported in HIC studies.[20,21]

    Study limitations

Our study has several limitations in addition to those imparted by its retrospective design. Firstly, as a single-site study at a paediatric teaching hospital, our sample was not representative of all hospitals in the country, and as administrative data quality may vary across hospitals and countries, generalisability of our findings to other settings is limited. However, we believe that, despite this limitation, our study has value in highlighting the phenomenon of poor discharge coding reliability in similar settings and in raising awareness and caution when considering the use of these data for important applications.

Secondly, we used only one PI, who had no formal training in discharge coding, to abstract and code the data from each chart. As part of quality control, an EP abstracted data from a sample of the medical records and PI-EP agreement was examined. Disagreements in quality control were discussed and common pitfalls were considered in the final coding process.

    Finally, confirming the primary source of the error responsible for the poor administrative data reliability is technically difficult in a retrospective folder review; our study did not adequately address this. Studies have shown a clear link between poorly documented medical notes and poor administrative data discharge-record reliability.[1,8]

Even if error in ICD-10 diagnostic coding were disregarded, the overall administrative data coding was still uniformly sparse and undercoded compared with the PI folder abstraction. The PI-EP comparison showed similar detail and completeness in diagnostic coding, with high median total diagnoses per patient and good overall agreement. Given these points, while medical note documentation may be a contributing factor, it seems unlikely to be the prime source of poor reliability in our study.

    All analyses were performed in the medical records department at RCWMCH.

    Recommendations

    The reality of limited resources in our setting dictates that the key to improvement lies in implementing cost-effective measures that collectively have a positive impact on data quality:

    Improvement to discharge summary preparation

We recommend that discharge summaries be typed to improve legibility and that ICD discharge codes be included with each discharge diagnosis list in the summary. This should help the non-medical ward clerk staff to enter reliable data at discharge into the HIS.

    Senior staff involvement

    We would also encourage senior staff (e.g. senior registrars, consultants, nurse unit managers) to become more actively involved in supervising discharge summaries, including activities such as regular discharge summary meetings and consultant sign-off of each discharge summary.

    Auditing

    Regular auditing of discharge-coding reliability is recommended to assess baseline reliability and track the impact of interventions.

    Practical health informatics training

Consideration should be given to including elements of health informatics in the curriculum for medical students at medical school and teaching hospitals.[8,20] Providing access to, and facilitating completion of, the free WHO ICD online training course[22] for junior medical staff, as well as non-medical ward clerk staff, is also recommended. In a recently published local study, this intervention was shown to significantly improve discharge-coding reliability.[23]

    More costly recommendations include the following:

Many of the cost-effective recommendations we have offered are available for implementation to some degree at RCWMCH (e.g. typed discharge summaries, senior staff summary checking, auditing, and the electronic continuity of care record (eCCR)). Therefore, even within the bounds of limited resources, positive changes towards accurate data collection can be realised if driven by strong senior leadership to initiate a shift in attitude around the importance of ICD coding. While we believe formal training for clinical and clerical staff to be a cost-effective investment in the long term, informal training from on-site experts is also worth considering. A private patient case manager, who is formally trained in ICD coding because this is necessary for accurate billing to medical aid schemes, is employed at RCWMCH. This case manager could serve as an on-site expert for the proposed informal training. This would go a long way towards encouraging understanding and compliance among the staff and could be implemented without excessive resource allocation. As previously mentioned, electronic record keeping has been aided in recent years by the launch of the eCCR system at state hospital facilities. It would be interesting to see its future impact, if any, on discharge-data reliability at RCWMCH.

     

    Conclusions

Our study demonstrated poor agreement between discharge diagnostic coding in the hospital electronic administrative database and that abstracted directly from medical folders for general medical and short-stay ward admissions at RCWMCH. These results should caution against the use of administrative discharge data as an information resource for any administrative or research purposes. We recommend that further studies be done to re-evaluate reliability after implementing quality-improvement interventions, as well as further research in general across varying healthcare facilities and with larger samples, to help define the overall state of discharge coding in LMICs.

    Declaration. None.

    Acknowledgements. We would like to thank the medical records department at RCWMCH for their patience with folder requests and allowing AD the use of their space. We would like to thank the patients and families who visit RCWMCH for their trust and patience and for allowing us to learn from them.

    Author contributions. AD performed the data collection and was the primary contributor in writing the manuscript, which is an edited version of the thesis submitted for his MMed degree. HB contributed to editing and supervision of manuscript writing, as well as with data collection for the quality-control section. RM was the primary contributor to the statistical analysis of the data and also contributed to the writing of the manuscript. LM contributed to editing and supervision of manuscript writing.

    Funding. The publication of this manuscript was supported in part by the Department of Paediatrics and Child Health Research Committee, University of Cape Town.

    Conflicts of interest. None.

     

    References

1. O'Malley KJ, Cook KF, Price MD, Wildes KR, Hurdle JF, Ashton CM. Measuring diagnoses: ICD code accuracy. Health Serv Res 2005;40:1620-1639. https://doi.org/10.1111/j.1475-6773.2005.00444.x

2. World Health Organization. International Classification of Diseases (ICD). http://www.who.int/classifications/icd/en/ (accessed 7 December 2020).

3. World Health Organization. International Statistical Classification of Diseases and Related Health Problems, 10th revision. Vol. 2: Instruction Manual. Geneva: WHO, 2011.

4. National Department of Health. National Task Team for the Implementation of ICD-10. South African ICD-10 Coding Standards. Pretoria: NDoH, 2012.

5. Burns EM, Rigby E, Mamidanna R, et al. Systematic review of discharge coding accuracy. J Public Health 2012;34(1):138-148. https://doi.org/10.1093/pubmed/fdr054

6. Peabody JW, Luck J, Jain S, Bertenthal D, Glassman P. Assessing the accuracy of administrative data in health information systems. Med Care 2004;42(11):1066-1072. https://doi.org/10.1097/00005650-200411000-00005

7. Gorelick MH, Knight S, Alessandrini EA, et al. Lack of agreement in pediatric emergency department discharge diagnoses from clinical and administrative data sources. Acad Emerg Med 2007;14(7):646-652. https://doi.org/10.1197/j.aem.2007.03.1357

8. So L, Beck CA, Brien S, et al. Chart documentation quality and its relationship to the validity of administrative data discharge records. Health Informatics J 2010;16(2):101-113. https://doi.org/10.1177/1460458210364784

9. National Department of Health. South African ICD-10 Technical User Guide. Version 2. Pretoria: NDoH, 2014.

10. Howard AE, Courtney-Shapiro C, Kelso LA, Goltz M, Morris PE. Comparison of 3 methods of detecting acute respiratory distress syndrome: Clinical screening, chart review, and diagnostic coding. Am J Crit Care 2004;13(1):59-64.

11. Chin N, Perera P, Roberts A, Nagappan R. Review of medical discharge summaries and medical documentation in a metropolitan hospital: Impact on diagnostic-related groups and weighted inlier equivalent separation. Intern Med J 2013;43(7):767-771. https://doi.org/10.1111/imj.12084

12. Jolley RJ, Quan H, Jette N, et al. Validation and optimisation of an ICD-10-coded case definition for sepsis using administrative health data. BMJ Open 2015;5(12). https://doi.org/10.1136/bmjopen-2015-009487

13. Martin BJ, Chen G, Graham M, Quan H. Coding of obesity in administrative hospital discharge abstract data: Accuracy and impact for future research studies. BMC Health Serv Res 2014;14:70. https://doi.org/10.1186/1472-6963-14-70

14. McCormick N, Lacaille D, Bhole V, Avina-Zubieta JA. Validity of heart failure diagnoses in administrative databases: A systematic review and meta-analysis. PLoS ONE 2014;9(8):e104519. https://doi.org/10.1371/journal.pone.0104519

15. McCormick N, Lacaille D, Bhole V, Avina-Zubieta JA. Validity of myocardial infarction diagnoses in administrative databases: A systematic review. PLoS ONE 2014;9(3):e92286. https://doi.org/10.1371/journal.pone.0092286

16. McCormick N, Bhole V, Lacaille D, Avina-Zubieta JA. Validity of diagnostic codes for acute stroke in administrative databases: A systematic review. PLoS ONE 2015;10(8):e0135834. https://doi.org/10.1371/journal.pone.0135834

17. Chute CG, Huff SM, Ferguson JA, Walker JM, Halamka JD. There are important reasons for delaying implementation of the new ICD-10 coding system. Health Affairs 2012;31(4):836-842. https://doi.org/10.1377/hlthaff.2011.1258

18. Januel JM, Luthi JC, Quan H, et al. Improved accuracy of co-morbidity coding over time after the introduction of ICD-10 administrative data. BMC Health Serv Res 2011;11:194. https://doi.org/10.1186/1472-6963-11-194

19. Hennessy DA, Quan H, Faris PD, Beck CA. Do coder characteristics influence validity of ICD-10 hospital discharge data? BMC Health Serv Res 2010;10:99. https://doi.org/10.1186/1472-6963-10-99

20. Rampatige R, Gamage S, Peiris S, Lopez AD. Assessing the reliability of causes of death reported by the vital registration system in Sri Lanka: Medical records review in Colombo. Health Inf Manag 2013;42(3):20-28. https://doi.org/10.1177/183335831304200302

21. Rao C, Yang G, Hu J, Ma J, Xia W, Lopez AD. Validation of cause-of-death statistics in urban China. Int J Epidemiol 2007;36(3):642-651. https://doi.org/10.1093/ije/dym003

22. World Health Organization. ICD-10 online training tool. 2010. http://apps.who.int/classifications/apps/icd/ICD10Training/ICD-10%20training/Start/index.html (accessed 15 December 2020).

23. Dyers R, Ward G, du Plooy S, Fourie S, Evans J, Mahomed H. Training and support to improve ICD coding quality: A controlled before-and-after impact evaluation. S Afr Med J 2017;107(6):501-506. https://doi.org/10.7196/samj.2017.v107i6.12075

     

     

    Correspondence:
    H Buys

    heloise.buys@uct.ac.za

    Accepted 18 August 2020