{"doc_id": 0, "doc": {"pubid": 21645374, "question": "Do mitochondria play a role in remodelling lace plant leaves during programmed cell death?", "context": {"contexts": ["Programmed cell death (PCD) is the regulated death of cells within an organism. The lace plant (Aponogeton madagascariensis) produces perforations in its leaves through PCD. The leaves of the plant consist of a latticework of longitudinal and transverse veins enclosing areoles. PCD occurs in the cells at the center of these areoles and progresses outwards, stopping approximately five cells from the vasculature. The role of mitochondria during PCD has been recognized in animals; however, it has been less studied during PCD in plants.", "The following paper elucidates the role of mitochondrial dynamics during developmentally regulated PCD in vivo in A. madagascariensis. A single areole within a window stage leaf (PCD is occurring) was divided into three areas based on the progression of PCD; cells that will not undergo PCD (NPCD), cells in early stages of PCD (EPCD), and cells in late stages of PCD (LPCD). Window stage leaves were stained with the mitochondrial dye MitoTracker Red CMXRos and examined. Mitochondrial dynamics were delineated into four categories (M1-M4) based on characteristics including distribution, motility, and membrane potential (ΔΨm). A TUNEL assay showed fragmented nDNA in a gradient over these mitochondrial stages. Chloroplasts and transvacuolar strands were also examined using live cell imaging. The possible importance of mitochondrial permeability transition pore (PTP) formation during PCD was indirectly examined via in vivo cyclosporine A (CsA) treatment. This treatment resulted in lace plant leaves with a significantly lower number of perforations compared to controls, and that displayed mitochondrial dynamics similar to that of non-PCD cells."], "labels": ["BACKGROUND", "RESULTS"], "meshes": ["Alismataceae", "Apoptosis", "Cell Differentiation", "Mitochondria", "Plant Leaves"], "reasoning_required_pred": ["y", "e", "s"], "reasoning_free_pred": ["y", "e", "s"]}, "long_answer": "Results depicted mitochondrial dynamics in vivo as PCD progresses within the lace plant, and highlight the correlation of this organelle with other organelles during developmental PCD. To the best of our knowledge, this is the first report of mitochondria and chloroplasts moving on transvacuolar strands to form a ring structure surrounding the nucleus during developmental PCD. Also, for the first time, we have shown the feasibility for the use of CsA in a whole plant system. Overall, our findings implicate the mitochondria as playing a critical and early role in developmentally regulated PCD in the lace plant.", "final_decision": "yes"}, "target": "yes", "arguments": ["Abstract: The data analysis was conducted to describe the rate of unsuccessful copper T380A intrauterine device (IUD) insertions among women using the IUD for emergency contraception (EC) at community family planning clinics in Utah.\nThese data were obtained from a prospective observational trial of women choosing the copper T380A IUD for EC. Insertions were performed by nurse practitioners at two family planning clinics in order to generalize findings to the type of service setting most likely to employ this intervention. Adjuvant measures to facilitate difficult IUD insertions (cervical anesthesia, dilation, pain medication, and use of ultrasound guidance) were not utilized. 
The effect of parity on IUD insertion success was determined using exact logistic regression models adjusted for individual practitioner failure rates.\nSix providers performed 197 IUD insertion attempts. These providers had a mean of 14.1 years of experience (range 1-27, S.D. ±12.5). Among nulliparous women, 27 of 138 (19.6%) IUD insertions were unsuccessful. In parous women, 8 of 59 IUD insertions were unsuccessful (13.6%). The adjusted odds ratio (aOR) showed that IUD insertion failure was more likely in nulliparous women compared to parous women (aOR=2.31, 95% CI 0.90-6.52, p=.09).\nQuestion: Failed IUD insertions in community practice: an under-recognized problem?\nAnswer: yes\n\nAbstract: The aim of this study was to analyze the properties of the immune cell microenvironment of regional lymph nodes (LNs) positive for lung cancer.\nTwenty-four patients operated on for stages T1 and T2 of the NSCLC, were enrolled in the study. Peripheral blood and LN tissue were obtained from different lymph node sites and levels. As a control, LN tissue was taken from patients diagnosed with emphysema or pneumothorax. The cells from randomly chosen LN were tested by multi-color flow cytometry. Separate portions of LN were snap-frozen and examined for the presence of cytokeratin positive cells (CK). Propensity for apoptosis, level of TCR zeta chain expression of T cells and the number and maturation status of dendritic cells were confronted with the presence of CK-positive cells.\nThe presence of metastases correlated with the downregulation of TCR zeta, especially CD8(+) T cells. The most striking feature was the reduction in the number of myeloid CD11c(+) dendritic cells in the LN of patients with LN metastases. This could be a reflection of the immunodeficient state observed in lung cancer patients. Even in the absence of metastases in the regional LN, the same type of changes in the LN microenvironment were observed in those LN located nearer the primary tumor.\nQuestion: Can the condition of the cell microenvironment of mediastinal lymph nodes help predict the risk of metastases in non-small cell lung cancer?\nAnswer: yes\n\nAbstract: The objective of the current study is to determine to what extent the reduction of Chile's traffic fatalities and injuries during 2000-2012 was related to the police traffic enforcement increment registered after the introduction of its 2005 traffic law reform.\nA unique dataset with assembled information from public institutions and analyses based on ordinary least square and robust random effects models was carried out. Dependent variables were traffic fatality and severe injury rates per population and vehicle fleet. Independent variables were: (1) presence of new national traffic law; (2) police officers per population; (3) number of traffic tickets per police officer; and (4) interaction effect of number of traffic tickets per police officer with traffic law reform. Oil prices, alcohol consumption, proportion of male population 15-24 years old, unemployment, road infrastructure investment, years' effects and regions' effects represented control variables.\nEmpirical estimates from instrumental variables suggest that the enactment of the traffic law reform in interaction with number of traffic tickets per police officer is significantly associated with a decrease of 8% in traffic fatalities and 7% in severe injuries. 
Piecewise regression model results for the 2007-2012 period suggest that police traffic enforcement reduced traffic fatalities by 59% and severe injuries by 37%.\nQuestion: Did Chile's traffic law reform push police enforcement?\nAnswer: yes\n\nAbstract: To compare atropine with placebo as an adjunct to ketamine sedation in children undergoing minor painful procedures. Outcome measures included hypersalivation, side effect profile, parental/patient satisfaction, and procedural success rate.\nChildren aged between 1 and 16 years of age requiring ketamine procedural sedation in a tertiary emergency department were randomised to receive 0.01 mg/kg of atropine or placebo. All received 4 mg/kg of intramuscular ketamine. Tolerance and sedation scores were recorded throughout the procedure. Side effects were recorded from the start of sedation until discharge. Parental and patient satisfaction scores were obtained at discharge and three to five days after the procedure, with the opportunity to report side effects encountered at home.\nA total of 83 patients aged 13 months to 14.5 years (median age 3.4 years) were enrolled over a 16 month period. Hypersalivation occurred in 11.4% of patients given atropine compared with 30.8% given placebo (odds ratio (OR) 0.29, 95% confidence interval (CI) 0.09 to 0.91). A transient rash was observed in 22.7% of the atropine group compared with 5.1% of the placebo group (OR 5.44, 95% CI 1.11 to 26.6). Vomiting during recovery occurred in 9.1% of atropine patients compared with 25.6% of placebo patients (OR 0.29, 95% CI 0.09 to 1.02). There was a trend towards better tolerance in the placebo group. No patient experienced serious side effects.\nQuestion: Is atropine needed with ketamine sedation?\nAnswer: yes\n\nAbstract: Programmed cell death (PCD) is the regulated death of cells within an organism. The lace plant (Aponogeton madagascariensis) produces perforations in its leaves through PCD. The leaves of the plant consist of a latticework of longitudinal and transverse veins enclosing areoles. PCD occurs in the cells at the center of these areoles and progresses outwards, stopping approximately five cells from the vasculature. The role of mitochondria during PCD has been recognized in animals; however, it has been less studied during PCD in plants.\nThe following paper elucidates the role of mitochondrial dynamics during developmentally regulated PCD in vivo in A. madagascariensis. A single areole within a window stage leaf (PCD is occurring) was divided into three areas based on the progression of PCD; cells that will not undergo PCD (NPCD), cells in early stages of PCD (EPCD), and cells in late stages of PCD (LPCD). Window stage leaves were stained with the mitochondrial dye MitoTracker Red CMXRos and examined. Mitochondrial dynamics were delineated into four categories (M1-M4) based on characteristics including distribution, motility, and membrane potential (ΔΨm). A TUNEL assay showed fragmented nDNA in a gradient over these mitochondrial stages. Chloroplasts and transvacuolar strands were also examined using live cell imaging. The possible importance of mitochondrial permeability transition pore (PTP) formation during PCD was indirectly examined via in vivo cyclosporine A (CsA) treatment. 
This treatment resulted in lace plant leaves with a significantly lower number of perforations compared to controls, and that displayed mitochondrial dynamics similar to that of non-PCD cells.\nQuestion: Do mitochondria play a role in remodelling lace plant leaves during programmed cell death?\nAnswer:", " yes"], "resps": [[[-1.6232320070266724, false]], [[-1.1910191774368286, true]], [[-9.578462600708008, false]]], "filtered_resps": [[-1.6232320070266724, false], [-1.1910191774368286, true], [-9.578462600708008, false]], "acc": 0.0} {"doc_id": 8, "doc": {"pubid": 17113061, "question": "Do mutations causing low HDL-C promote increased carotid intima-media thickness?", "context": {"contexts": ["Although observational data support an inverse relationship between high-density lipoprotein (HDL) cholesterol and coronary heart disease (CHD), genetic HDL deficiency states often do not correlate with premature CHD.", "Carotid intima-media thickness (cIMT) measurements were obtained in cases comprising 10 different mutations in LCAT, ABCA1 and APOA1 to further evaluate the relationship between low HDL resulting from genetic variation and early atherosclerosis.", "In a 1:2 case-control study of sex and age-related (+/-5 y) subjects (n=114), cIMT was nearly identical between cases (0.66+/-0.17 cm) and controls (0.65+/-0.18 cm) despite significantly lower HDL cholesterol (0.67 vs. 1.58 mmol/l) and apolipoprotein A-I levels (96.7 vs. 151.4 mg/dl) (P<0.05)"], "labels": ["BACKGROUND", "METHODS", "RESULTS"], "meshes": ["Cholesterol, HDL", "Contrast Media", "Coronary Disease", "Female", "Humans", "Male", "Mutation", "Risk Factors"], "reasoning_required_pred": ["n", "o"], "reasoning_free_pred": ["n", "o"]}, "long_answer": "Genetic variants identified in the present study may be insufficient to promote early carotid atherosclerosis.", "final_decision": "no"}, "target": "no", "arguments": ["Abstract: Recent studies have implicated the human cytomegalovirus (HCMV) as a possible pathogen for causing hypertension. We aimed to study the association between HCMV infection and hypertension in the United States National Health and Nutrition Examination Survey (NHANES).\nWe analyzed data on 2979 men and 3324 women in the NHANES 1999-2002. We included participants aged 16-49 years who had valid data on HCMV infection and hypertension.\nOf the participants, 54.7% had serologic evidence of HCMV infection and 17.5% had hypertension. There were ethnic differences in the prevalence of HCMV infection (P<0.001) and hypertension (P<0.001). The prevalence of both increased with age (P<0.001). Before adjustment, HCMV seropositivity was significantly associated with hypertension in women (OR=1.63, 95% CI=1.25-2.13, P=0.001) but not in men. After adjustment for race/ethnicity, the association between HCMV seropositivity and hypertension in women remained significant (OR=1.55, 95% CI=1.20-2.02, P=0.002). Further adjustment for body mass index, diabetes status and hypercholesterolemia attenuated the association (OR=1.44, 95% CI=1.10-1.90, P=0.010). However, after adjusting for age, the association was no longer significant (OR=1.24, 95% CI=0.91-1.67, P=0.162).\nQuestion: Is human cytomegalovirus infection associated with hypertension?\nAnswer: no\n\nAbstract: Cardiovascular disease is prevalent among workers with high levels of occupational physical activity. The increased risk may be due to a high relative aerobic workload, possibly leading to increased blood pressure. 
However, studies investigating the relation between relative aerobic workload and ambulatory blood pressure (ABP) are lacking. The aim was to explore the relationship between objectively measured relative aerobic workload and ABP.\nA total of 116 cleaners aged 18-65 years were included after informed consent was obtained. A portable device (Spacelabs 90217) was mounted for 24-h measurements of ABP, and an Actiheart was mounted for 24-h heart rate measurements to calculate relative aerobic workload as percentage of relative heart rate reserve. A repeated-measure multi-adjusted mixed model was applied for analysis.\nA fully adjusted mixed model of measurements throughout the day showed significant positive relations (p<0.001): a 1% increase in mean relative aerobic workload was associated with an increase of 0.42 ± 0.05 mmHg (95% CI 0.32-0.52 mmHg) in systolic ABP and 0.30 ± 0.04 mmHg (95% CI 0.22-0.38 mmHg) in diastolic ABP. Correlations between relative aerobic workload and ABP were significant.\nQuestion: Is aerobic workload positively related to ambulatory blood pressure?\nAnswer: yes\n\nAbstract: To evaluate the effectiveness of the role of a discharge coordinator whose sole responsibility was to plan and coordinate the discharge of patients from medical wards.\nAn intervention study in which the quality of discharge planning was assessed before and after the introduction of a discharge coordinator. Patients were interviewed on the ward before discharge and seven to 10 days after being discharged home.\nThe three medical wards at the Homerton Hospital in Hackney, East London.\n600 randomly sampled adult patients admitted to the medical wards of the study hospital, who were resident in the district (but not in institutions), were under the care of physicians (excluding psychiatry), and were discharged home from one of the medical wards. The sampling was conducted in three study phases, over 18 months.\nPhase I comprised base line data collection; in phase II data were collected after the introduction of the district discharge planning policy and a discharge form (checklist) for all patients; in phase III data were collected after the introduction of the discharge coordinator.\nThe quality and out come of discharge planning. Readmission rates, duration of stay, appropriateness of days of care, patients' health and satisfaction, problems after discharge, and receipt of services.\nThe discharge coordinator resulted in an improved discharge planning process, and there was a reduction in problems experienced by patients after discharge, and in perceived need for medical and healthcare services. There was no evidence that the discharge coordinator resulted in a more timely or effective provision of community services after discharge, or that the appropriateness or efficiency of bed use was improved.\nQuestion: Does a dedicated discharge coordinator improve the quality of hospital discharge?\nAnswer: yes\n\nAbstract: We observed an endoscopic abnormally in a group of children with histological esophagitis. We termed this finding \"vertical lines in esophageal mucosa\" (VLEM). We examined the relationship between the presence of VLEM and significant histologic changes in esophageal mucosal biopsies.\nBetween January 1, 1992, and August 31, 1994, the senior author (JFF) performed 255 esophageal biopsies. The procedure reports, available endoscopic photographs, and histology reports were reviewed to establish the endoscopic and histologic appearance of the esophageal mucosa. 
Intraepithelial cells were counted in a blind review of 42 randomly selected biopsies.\nThe esophageal mucosa had a normal appearance on 160 endoscopic studies (Group 1) and VLEM were the only mucosal abnormalities in 41 endoscopies (Group 2). Histology was normal in 92 of 160 biopsies (57.5%) from Group 1, and 1 of 41 biopsies (2.4%) from Group 2. Most patients in Group 2 had eosinophilic esophagitis (34 of 41, 83%, specificity 0.85, sensitivity 0.5, p>0.001) which was of moderate to severe intensity (31 of 34, 91.2%, specificity 0.88, sensitivity 0.73, p<0.001).\nQuestion: Vertical lines in distal esophageal mucosa (VLEM): a true endoscopic manifestation of esophagitis in children?\nAnswer: yes\n\nAbstract: Although observational data support an inverse relationship between high-density lipoprotein (HDL) cholesterol and coronary heart disease (CHD), genetic HDL deficiency states often do not correlate with premature CHD.\nCarotid intima-media thickness (cIMT) measurements were obtained in cases comprising 10 different mutations in LCAT, ABCA1 and APOA1 to further evaluate the relationship between low HDL resulting from genetic variation and early atherosclerosis.\nIn a 1:2 case-control study of sex and age-related (+/-5 y) subjects (n=114), cIMT was nearly identical between cases (0.66+/-0.17 cm) and controls (0.65+/-0.18 cm) despite significantly lower HDL cholesterol (0.67 vs. 1.58 mmol/l) and apolipoprotein A-I levels (96.7 vs. 151.4 mg/dl) (P<0.05)\nQuestion: Do mutations causing low HDL-C promote increased carotid intima-media thickness?\nAnswer:", " yes"], "resps": [[[-1.337092638015747, false]], [[-0.7598450779914856, true]], [[-9.417948722839355, false]]], "filtered_resps": [[-1.337092638015747, false], [-0.7598450779914856, true], [-9.417948722839355, false]], "acc": 1.0} {"doc_id": 1, "doc": {"pubid": 16418930, "question": "Landolt C and snellen e acuity: differences in strabismus amblyopia?", "context": {"contexts": ["Assessment of visual acuity depends on the optotypes used for measurement. The ability to recognize different optotypes differs even if their critical details appear under the same visual angle. Since optotypes are evaluated on individuals with good visual acuity and without eye disorders, differences in the lower visual acuity range cannot be excluded. In this study, visual acuity measured with the Snellen E was compared to the Landolt C acuity.", "100 patients (age 8 - 90 years, median 60.5 years) with various eye disorders, among them 39 with amblyopia due to strabismus, and 13 healthy volunteers were tested. Charts with the Snellen E and the Landolt C (Precision Vision) which mimic the ETDRS charts were used to assess visual acuity. Three out of 5 optotypes per line had to be correctly identified, while wrong answers were monitored. In the group of patients, the eyes with the lower visual acuity, and the right eyes of the healthy subjects, were evaluated.", "Differences between Landolt C acuity (LR) and Snellen E acuity (SE) were small. The mean decimal values for LR and SE were 0.25 and 0.29 in the entire group and 0.14 and 0.16 for the eyes with strabismus amblyopia. The mean difference between LR and SE was 0.55 lines in the entire group and 0.55 lines for the eyes with strabismus amblyopia, with higher values of SE in both groups. 
The results of the other groups were similar with only small differences between LR and SE."], "labels": ["BACKGROUND", "PATIENTS AND METHODS", "RESULTS"], "meshes": ["Adolescent", "Adult", "Aged", "Aged, 80 and over", "Amblyopia", "Cataract", "Child", "Eye Diseases", "Female", "Humans", "Male", "Middle Aged", "Reference Values", "Refractive Errors", "Reproducibility of Results", "Retinal Diseases", "Strabismus", "Vision Tests", "Visual Acuity"], "reasoning_required_pred": ["n", "o"], "reasoning_free_pred": ["n", "o"]}, "long_answer": "Using the charts described, there was only a slight overestimation of visual acuity by the Snellen E compared to the Landolt C, even in strabismus amblyopia. Small differences in the lower visual acuity range have to be considered.", "final_decision": "no"}, "target": "no", "arguments": ["Abstract: Current guidelines recommend total thyroidectomy for nearly all children with well-differentiated thyroid cancer (WDTC). These guidelines, however, derive from older data accrued prior to current high-resolution imaging. We speculate that there is a subpopulation of children who may be adequately treated with lobectomy.\nRetrospective analysis of prospectively maintained database.\nSeventy-three children with WDTC treated between 2004 and 2015.\nWe applied two different risk-stratification criteria to this population. First, we determined the number of patients meeting American Thyroid Association (ATA) 'low-risk' criteria, defined as disease grossly confined to the thyroid with either N0/Nx or incidental microscopic N1a disease. Second, we defined a set of 'very-low-risk' histopathological criteria, comprising unifocal tumours ≤4 cm without predefined high-risk factors, and determined the proportion of patients that met these criteria.\nTwenty-seven (37%) males and 46 (63%) females were included in this study, with a mean age of 13·4 years. Ipsilateral- and contralateral multifocality were identified in 27 (37·0%) and 19 (26·0%) of specimens. Thirty-seven (51%) patients had lymph node metastasis (N1a = 18/N1b = 19). Pre-operative ultrasound identified all cases with clinically significant nodal disease. Of the 73 patients, 39 (53·4%) met ATA low-risk criteria and 16 (21·9%) met 'very-low-risk' criteria. All 'very-low-risk' patients demonstrated excellent response to initial therapy without persistence/recurrence after a mean follow-up of 36·4 months.\nQuestion: Is it time to reconsider lobectomy in low-risk paediatric thyroid cancer?\nAnswer: yes\n\nAbstract: Little is known about how information needs change over time in the early postpartum period or about how these needs might differ given socioeconomic circumstances. This study's aim was to examine women's concerns at the time of hospital discharge and unmet learning needs as self-identified at 4 weeks after discharge.\nData were collected as part of a cross-sectional survey of postpartum health outcomes, service use, and costs of care in the first 4 weeks after postpartum hospital discharge. Recruitment of 250 women was conducted from each of 5 hospitals in Ontario, Canada (n = 1,250). Women who had given vaginal birth to a single live infant, and who were being discharged at the same time as their infant, assuming care of their infant, competent to give consent, and able to communicate in one of the study languages were eligible. 
Participants completed a self-report questionnaire in hospital; 890 (71.2%) took part in a structured telephone interview 4 weeks after hospital discharge.\nApproximately 17 percent of participants were of low socioeconomic status. Breastfeeding and signs of infant illness were the most frequently identified concerns by women, regardless of their socioeconomic status. Signs of infant illness and infant care/behavior were the main unmet learning needs. Although few differences in identified concerns were evident, women of low socioeconomic status were significantly more likely to report unmet learning needs related to 9 of 10 topics compared with women of higher socioeconomic status. For most topics, significantly more women of both groups identified learning needs 4 weeks after discharge compared with the number who identified corresponding concerns while in hospital.\nQuestion: Learning needs of postpartum women: does socioeconomic status matter?\nAnswer: yes\n\nAbstract: To determine the ability of early sonogram to predict the presentation of twin A at birth.\nA retrospective cohort study was conducted on all twin pregnancies evaluated at our Fetal Evaluation Unit from 2007 to 2009. Sonogram records were reviewed for the presentation of twin A at seven gestational age intervals and inpatient medical records were reviewed for the presentation of twin A at delivery. The positive predictive value, sensitivity, and specificity of presentation as determined by ultrasound, at each gestational age interval, for the same presentation at delivery were calculated.\nTwo hundred and thirty-eight twin pregnancies met inclusion criteria. A total of 896 ultrasounds were reviewed. The positive predictive value of cephalic presentation of twin A as determined by ultrasound for the persistence of cephalic presentation at delivery reached 95% after 28 weeks gestation. The positive predictive value for noncephalic presentation as established by sonogram for noncephalic at delivery was>90% after 32 weeks gestation.\nQuestion: Can third trimester ultrasound predict the presentation of the first twin at delivery?\nAnswer: yes\n\nAbstract: To evaluate the impact of patient-prosthesis mismatch (PPM) on survival, functional status, and quality of life (QoL) after aortic valve replacement (AVR) with small prosthesis size in elderly patients.\nBetween January 2005 and December 2013, 152 patients with pure aortic stenosis, aged at least 75 years, underwent AVR, with a 19 or 21 mm prosthetic heart valve. PPM was defined as an indexed effective orifice area less than 0.85 cm/m. Median age was 82 years (range 75-93 years). Mean follow-up was 56 months (range 1-82 months) and was 98% complete. Late survival rate, New York Heart Association functional class, and QoL (RAND SF-36) were assessed.\nOverall, PPM was found in 78 patients (53.8%). Among them, 42 patients (29%) had an indexed effective orifice area less than 0.75 cm/m and 17 less than 0.65 cm/m (11.7%). Overall survival at 5 years was 78 ± 4.5% and was not influenced by PPM (P = NS). The mean New York Heart Association class for long-term survivors with PPM improved from 3.0 to 1.7 (P < 0.001). 
QoL (physical functioning 45.18 ± 11.35, energy/fatigue 49.36 ± 8.64, emotional well being 58.84 ± 15.44, social functioning 61.29 ± 6.15) was similar to that of no-PPM patients (P = NS).\nQuestion: Does patient-prosthesis mismatch after aortic valve replacement affect survival and quality of life in elderly patients?\nAnswer: no\n\nAbstract: Assessment of visual acuity depends on the optotypes used for measurement. The ability to recognize different optotypes differs even if their critical details appear under the same visual angle. Since optotypes are evaluated on individuals with good visual acuity and without eye disorders, differences in the lower visual acuity range cannot be excluded. In this study, visual acuity measured with the Snellen E was compared to the Landolt C acuity.\n100 patients (age 8 - 90 years, median 60.5 years) with various eye disorders, among them 39 with amblyopia due to strabismus, and 13 healthy volunteers were tested. Charts with the Snellen E and the Landolt C (Precision Vision) which mimic the ETDRS charts were used to assess visual acuity. Three out of 5 optotypes per line had to be correctly identified, while wrong answers were monitored. In the group of patients, the eyes with the lower visual acuity, and the right eyes of the healthy subjects, were evaluated.\nDifferences between Landolt C acuity (LR) and Snellen E acuity (SE) were small. The mean decimal values for LR and SE were 0.25 and 0.29 in the entire group and 0.14 and 0.16 for the eyes with strabismus amblyopia. The mean difference between LR and SE was 0.55 lines in the entire group and 0.55 lines for the eyes with strabismus amblyopia, with higher values of SE in both groups. The results of the other groups were similar with only small differences between LR and SE.\nQuestion: Landolt C and snellen e acuity: differences in strabismus amblyopia?\nAnswer:", " yes"], "resps": [[[-2.011251926422119, false]], [[-0.9465469121932983, true]], [[-10.619100570678711, false]]], "filtered_resps": [[-2.011251926422119, false], [-0.9465469121932983, true], [-10.619100570678711, false]], "acc": 1.0} {"doc_id": 9, "doc": {"pubid": 10966337, "question": "A short stay or 23-hour ward in a general and academic children's hospital: are they effective?", "context": {"contexts": ["We evaluated the usefulness of a short stay or 23-hour ward in a pediatric unit of a large teaching hospital, Westmead Hospital, and an academic Children's hospital, The New Children's Hospital, to determine if they are a useful addition to the emergency service.", "This is a descriptive comparison of prospectively collected data on all children admitted to the short stay ward at Westmead Hospital (WH) during 1994 and the short stay ward at the New Children's Hospital (NCH) during 1997-98. These hospitals service an identical demographic area with the latter (NCH) a tertiary referral center. The following outcome measures were used: length of stay, appropriateness of stay, rate of admission to an in-hospital bed, and rate of unscheduled visits within 72 hours of discharge. Adverse events were reported and patient follow-up was attempted at 48 hours after discharge in all cases.", "The short stay ward accounted for 10.3% (Westmead Hospital) and 14.7% (New Children's Hospital) of admissions, with 56% medical in nature, 30% surgical, and the remainder procedural or psychological. 
Admission patterns were similar, with asthma, gastroenteritis, convulsion, pneumonia, and simple surgical conditions accounting for most short stay ward admissions. The short stay ward increased hospital efficiency with an average length of stay of 17.5 hours (Westmead Hospital) compared to 20.5 hours (New Children's Hospital). The users of the short stay ward were children of young age less than 2 years, with stay greater than 23 hours reported in only 1% of all admissions to the short stay ward. The rate of patient admission to an in-hospital bed was low, (4% [Westmead Hospital] compared to 6% [New Children's Hospital]), with the number of unscheduled visits within 72 hours of short stay ward discharge less than 1%. There were no adverse events reported at either short stay ward, with parental satisfaction high. The short stay ward was developed through reallocation of resources from within the hospital to the short stay ward. This resulted in estimated savings of $1/2 million (Westmead Hospital) to $2.3 million (New Children's Hospital) to the hospital, due to more efficient bed usage."], "labels": ["OBJECTIVE", "METHODS", "RESULTS"], "meshes": ["Academic Medical Centers", "Acute Disease", "Adolescent", "Child", "Child, Preschool", "Critical Pathways", "Emergency Service, Hospital", "Follow-Up Studies", "Hospital Units", "Hospitals, General", "Hospitals, Pediatric", "Humans", "Infant", "Length of Stay", "New South Wales", "Outcome Assessment (Health Care)", "Pediatrics", "Prospective Studies", "Time Factors"], "reasoning_required_pred": ["y", "e", "s"], "reasoning_free_pred": ["y", "e", "s"]}, "long_answer": "This data demonstrates the robust nature of the short stay ward. At these two very different institutions we have shown improved bed efficient and patient care in a cost-effective way. We have also reported on greater parental satisfaction and early return of the child with their family to the community.", "final_decision": "yes"}, "target": "yes", "arguments": ["Abstract: The aims of the study were to report the rates of recurrent and residual cholesteatoma following primary CAT surgery and to report the rate of conversion to a modified radical mastoidectomy.\nThis was a retrospective review of a single surgeon series between 2006 and 2012.\nIn total 132 second-look operations were undertaken, with a mean interval between primary surgery and second-look procedures of 6 months. The rate of cholesteatoma at second-look surgery was 19.7%, which was split into residual disease (10.6%) and recurrent disease (9.09%). New tympanic membrane defects with cholesteatoma were considered as recurrent disease. Residual disease was defined as cholesteatoma present behind an intact tympanic membrane. The majority of recurrent and residual disease was easily removed at second look (73.1%). Only four cases were converted to a modified radical mastoidectomy (3%) and three cases required a third-look procedure.\nQuestion: Can early second-look tympanoplasty reduce the rate of conversion to modified radical mastoidectomy?\nAnswer: yes\n\nAbstract: To determine the ability of early sonogram to predict the presentation of twin A at birth.\nA retrospective cohort study was conducted on all twin pregnancies evaluated at our Fetal Evaluation Unit from 2007 to 2009. Sonogram records were reviewed for the presentation of twin A at seven gestational age intervals and inpatient medical records were reviewed for the presentation of twin A at delivery. 
The positive predictive value, sensitivity, and specificity of presentation as determined by ultrasound, at each gestational age interval, for the same presentation at delivery were calculated.\nTwo hundred and thirty-eight twin pregnancies met inclusion criteria. A total of 896 ultrasounds were reviewed. The positive predictive value of cephalic presentation of twin A as determined by ultrasound for the persistence of cephalic presentation at delivery reached 95% after 28 weeks gestation. The positive predictive value for noncephalic presentation as established by sonogram for noncephalic at delivery was>90% after 32 weeks gestation.\nQuestion: Can third trimester ultrasound predict the presentation of the first twin at delivery?\nAnswer: yes\n\nAbstract: To assess the results of transsphenoidal pituitary surgery in patients with Cushing's disease over a period of 18 years, and to determine if there are factors which will predict the outcome.\nSixty-nine sequential patients treated surgically by a single surgeon in Newcastle upon Tyne between 1980 and 1997 were identified and data from 61 of these have been analysed.\nRetrospective analysis of outcome measures.\nPatients were divided into three groups (remission, failure and relapse) depending on the late outcome of their treatment as determined at the time of analysis, i.e. 88 months (median) years after surgery. Remission is defined as biochemical reversal of hypercortisolism with re-emergence of diurnal circadian rhythm, resolution of clinical features and adequate suppression on low-dose dexamethasone testing. Failure is defined as the absence of any of these features. Relapse is defined as the re-emergence of Cushing's disease more than one year after operation. Clinical features such as weight, sex, hypertension, associated endocrine disorders and smoking, biochemical studies including preoperative and postoperative serum cortisol, urine free cortisol, serum ACTH, radiological, histological and surgical findings were assessed in relation to these three groups to determine whether any factors could reliably predict failure or relapse after treatment.\nOf the 61 patients included in this study, 48 (78.7%) achieved initial remission and 13 (21.3%) failed treatment. Seven patients suffered subsequent relapse (range 22-158 months) in their condition after apparent remission, leaving a final group of 41 patients (67.2%) in the remission group. Tumour was identified at surgery in 52 patients, of whom 38 achieved remission. In comparison, only 3 of 9 patients in whom no tumour was identified achieved remission. This difference was significant (P = 0.048). When both radiological and histological findings were positive, the likelihood of achieving remission was significantly higher than if both modalities were negative (P = 0.038). There were significant differences between remission and failure groups when 2- and 6-week postoperative serum cortisol levels (P = 0.002 and 0.001, respectively) and 6-week postoperative urine free cortisol levels (P = 0.026) were compared. This allowed identification of patients who failed surgical treatment in the early postoperative period. Complications of surgery included transitory DI in 13, transitory CSF leak in 8 and transitory nasal discharge and cacosmia in 3. Twelve of 41 patients required some form of hormonal replacement therapy despite achieving long-term remission. 
Thirteen patients underwent a second operation, of whom 5 achieved remission.\nQuestion: Transsphenoidal pituitary surgery in Cushing's disease: can we predict outcome?\nAnswer: yes\n\nAbstract: The objective was to evaluate the efficacy of diffusion-weighted imaging (DWI) in predicting the development of vascularization in hypovascular hepatocellular lesions (HHLs).\nForty-two HHLs that were diagnosed by computed tomographic (CT) arteriography were evaluated retrospectively. The lesion on DWI was classified as isointense, hypointense, or hyperintense. Follow-up studies that included intravenous dynamic CT or magnetic resonance imaging were performed.\nThe 730-day cumulative developments of vascularization in hypointense, isointense, and hyperintense lesions were 17%, 30%, and 40%, respectively. The differences among these developments were not statistically significant.\nQuestion: Is diffusion-weighted imaging a significant indicator of the development of vascularization in hypovascular hepatocellular lesions?\nAnswer: no\n\nAbstract: We evaluated the usefulness of a short stay or 23-hour ward in a pediatric unit of a large teaching hospital, Westmead Hospital, and an academic Children's hospital, The New Children's Hospital, to determine if they are a useful addition to the emergency service.\nThis is a descriptive comparison of prospectively collected data on all children admitted to the short stay ward at Westmead Hospital (WH) during 1994 and the short stay ward at the New Children's Hospital (NCH) during 1997-98. These hospitals service an identical demographic area with the latter (NCH) a tertiary referral center. The following outcome measures were used: length of stay, appropriateness of stay, rate of admission to an in-hospital bed, and rate of unscheduled visits within 72 hours of discharge. Adverse events were reported and patient follow-up was attempted at 48 hours after discharge in all cases.\nThe short stay ward accounted for 10.3% (Westmead Hospital) and 14.7% (New Children's Hospital) of admissions, with 56% medical in nature, 30% surgical, and the remainder procedural or psychological. Admission patterns were similar, with asthma, gastroenteritis, convulsion, pneumonia, and simple surgical conditions accounting for most short stay ward admissions. The short stay ward increased hospital efficiency with an average length of stay of 17.5 hours (Westmead Hospital) compared to 20.5 hours (New Children's Hospital). The users of the short stay ward were children of young age less than 2 years, with stay greater than 23 hours reported in only 1% of all admissions to the short stay ward. The rate of patient admission to an in-hospital bed was low, (4% [Westmead Hospital] compared to 6% [New Children's Hospital]), with the number of unscheduled visits within 72 hours of short stay ward discharge less than 1%. There were no adverse events reported at either short stay ward, with parental satisfaction high. The short stay ward was developed through reallocation of resources from within the hospital to the short stay ward. 
This resulted in estimated savings of $1/2 million (Westmead Hospital) to $2.3 million (New Children's Hospital) to the hospital, due to more efficient bed usage.\nQuestion: A short stay or 23-hour ward in a general and academic children's hospital: are they effective?\nAnswer:", " yes"], "resps": [[[-1.6780028343200684, false]], [[-0.6899580359458923, true]], [[-9.939301490783691, false]]], "filtered_resps": [[-1.6780028343200684, false], [-0.6899580359458923, true], [-9.939301490783691, false]], "acc": 0.0} {"doc_id": 2, "doc": {"pubid": 9488747, "question": "Syncope during bathing in infants, a pediatric form of water-induced urticaria?", "context": {"contexts": ["Apparent life-threatening events in infants are a difficult and frequent problem in pediatric practice. The prognosis is uncertain because of risk of sudden infant death syndrome.", "Eight infants aged 2 to 15 months were admitted during a period of 6 years; they suffered from similar maladies in the bath: on immersion, they became pale, hypotonic, still and unreactive; recovery took a few seconds after withdrawal from the bath and stimulation. Two diagnoses were initially considered: seizure or gastroesophageal reflux but this was doubtful. The hypothesis of an equivalent of aquagenic urticaria was then considered; as for patients with this disease, each infant's family contained members suffering from dermographism, maladies or eruption after exposure to water or sun. All six infants had dermographism. We found an increase in blood histamine levels after a trial bath in the two infants tested. The evolution of these \"aquagenic maladies\" was favourable after a few weeks without baths. After a 2-7 year follow-up, three out of seven infants continue to suffer from troubles associated with sun or water."], "labels": ["BACKGROUND", "CASE REPORTS"], "meshes": ["Baths", "Histamine", "Humans", "Infant", "Syncope", "Urticaria", "Water"], "reasoning_required_pred": ["y", "e", "s"], "reasoning_free_pred": ["y", "e", "s"]}, "long_answer": "\"Aquagenic maladies\" could be a pediatric form of the aquagenic urticaria.", "final_decision": "yes"}, "target": "yes", "arguments": ["Abstract: Blood stream infection (BSI) and the subsequent development of sepsis are among the most common infection complications occurring in severe burn patients. This study was designed to evaluate the relationship between the burn wound flora and BSI pathogens.\nDocumentation of all bacterial and fungal wound and blood isolates from severe burn patients hospitalized in the burn unit and intensive care unit was obtained from medical records retrieved retrospectively from a computerized, hospital-wide database over a 13-year period. All data were recorded in relation to the Ryan score.\nOf 195 severe burn patients, 88 had at least 1 BSI episode. Transmission of the same pathogen from wound to blood was documented in 30% of the patients, with a rising BSI frequency as the Ryan score increased. 
There were a total of 263 bacteremic episodes in 88 study patients, 44% of blood isolates were documented previously in wound cultures, and transmission of the same pathogen from wound to blood was noted in 65% of bacteremic patients.\nQuestion: Do Wound Cultures Give Information About the Microbiology of Blood Cultures in Severe Burn Patients?\nAnswer: yes\n\nAbstract: To determine if composite measures based on process indicators are consistent with short-term outcome indicators in surgical colorectal cancer care.\nLongitudinal analysis of consistency between composite measures based on process indicators and outcome indicators for 85 Dutch hospitals.\nThe Dutch Surgical Colorectal Audit database, the Netherlands.\n4732 elective patients with colon carcinoma and 2239 with rectum carcinoma treated in 85 hospitals were included in the analyses.\nAll available process indicators were aggregated into five different composite measures. The association of the different composite measures with risk-adjusted postoperative mortality and morbidity was analysed at the patient and hospital level.\nAt the patient level, only one of the composite measures was negatively associated with morbidity for rectum carcinoma. At the hospital level, a strong negative association was found between composite measures and hospital mortality and morbidity rates for rectum carcinoma (p<0.05), and hospital morbidity rates for colon carcinoma.\nQuestion: Combining process indicators to evaluate quality of care for surgical patients with colorectal cancer: are scores consistent with short-term outcome?\nAnswer: maybe\n\nAbstract: The effect of preoperative education on anxiety and postoperative outcomes of cardiac surgery patients remains unclear.AIM: The aim of the study was to estimate the effectiveness of a nurse-led preoperative education on anxiety and postoperative outcomes.\nA randomised controlled study was designed. All the patients who were admitted for elective cardiac surgery in a general hospital in Athens with knowledge of the Greek language were eligible to take part in the study. Patients in the intervention group received preoperative education by specially trained nurses. The control group received the standard information by the ward personnel. Measurements of anxiety were conducted on admission-A, before surgery-B and before discharge-C by the state-trait anxiety inventory.\nThe sample consisted of 395 patients (intervention group: 205, control group: 190). The state anxiety on the day before surgery decreased only in the intervention group (34.0 (8.4) versus 36.9 (10.7); P=0.001). The mean decrease in state score during the follow-up period was greater in the intervention group (P=0.001). No significant difference was found in the length of stay or readmission. Lower proportions of chest infection were found in the intervention group (10 (5.3) versus 1 (0.5); P=0.004). 
Multivariate linear regression revealed that education and score in trait anxiety scale on admission are independent predictors of a reduction in state anxiety.\nQuestion: Can nurse-led preoperative education reduce anxiety and postoperative complications of patients undergoing cardiac surgery?\nAnswer: yes\n\nAbstract: It is generally believed that positioning of the patient in a head-down tilt (Trendelenberg position) decreases the likelihood of a venous air embolism during liver resection.\nThe physiological effect of variation in horizontal attitude on central and hepatic venous pressure was measured in 10 patients during liver surgery. Hemodynamic indices were recorded with the operating table in the horizontal, 20 degrees head-up and 20 degrees head-down positions.\nThere was no demonstrable pressure gradient between the hepatic and central venous levels in any of the positions. The absolute pressures did, however, vary in a predictable way, being highest in the head-down and lowest during head-up tilt. However, on no occasion was a negative intraluminal pressure recorded.\nQuestion: Does patient position during liver surgery influence the risk of venous air embolism?\nAnswer: no\n\nAbstract: Apparent life-threatening events in infants are a difficult and frequent problem in pediatric practice. The prognosis is uncertain because of risk of sudden infant death syndrome.\nEight infants aged 2 to 15 months were admitted during a period of 6 years; they suffered from similar maladies in the bath: on immersion, they became pale, hypotonic, still and unreactive; recovery took a few seconds after withdrawal from the bath and stimulation. Two diagnoses were initially considered: seizure or gastroesophageal reflux but this was doubtful. The hypothesis of an equivalent of aquagenic urticaria was then considered; as for patients with this disease, each infant's family contained members suffering from dermographism, maladies or eruption after exposure to water or sun. All six infants had dermographism. We found an increase in blood histamine levels after a trial bath in the two infants tested. The evolution of these \"aquagenic maladies\" was favourable after a few weeks without baths. After a 2-7 year follow-up, three out of seven infants continue to suffer from troubles associated with sun or water.\nQuestion: Syncope during bathing in infants, a pediatric form of water-induced urticaria?\nAnswer:", " yes"], "resps": [[[-1.3580057621002197, false]], [[-0.8325159549713135, true]], [[-4.511409759521484, false], [-4.511409759521484, false], [-4.511409759521484, false], [-4.511409759521484, false]]], "filtered_resps": [[-1.3580057621002197, false], [-0.8325159549713135, true], [-4.511409759521484, false]], "acc": 0.0} {"doc_id": 3, "doc": {"pubid": 17208539, "question": "Are the long-term results of the transanal pull-through equal to those of the transabdominal pull-through?", "context": {"contexts": ["The transanal endorectal pull-through (TERPT) is becoming the most popular procedure in the treatment of Hirschsprung disease (HD), but overstretching of the anal sphincters remains a critical issue that may impact the continence. This study examined the long-term outcome of TERPT versus conventional transabdominal (ABD) pull-through for HD.", "Records of 41 patients more than 3 years old who underwent a pull-through for HD (TERPT, n = 20; ABD, n = 21) were reviewed, and their families were thoroughly interviewed and scored via a 15-item post-pull-through long-term outcome questionnaire. 
Patients were operated on between the years 1995 and 2003. During this time, our group transitioned from the ABD to the TERPT technique. Total scoring ranged from 0 to 40: 0 to 10, excellent; 11 to 20 good; 21 to 30 fair; 31 to 40 poor. A 2-tailed Student t test, analysis of covariance, as well as logistic and linear regression were used to analyze the collected data with confidence interval higher than 95%.", "Overall scores were similar. However, continence score was significantly better in the ABD group, and the stool pattern score was better in the TERPT group. A significant difference in age at interview between the 2 groups was noted; we therefore reanalyzed the data controlling for age, and this showed that age did not significantly affect the long-term scoring outcome between groups."], "labels": ["PURPOSE", "METHODS", "RESULTS"], "meshes": ["Child", "Child, Preschool", "Colectomy", "Female", "Hirschsprung Disease", "Humans", "Male", "Treatment Outcome"], "reasoning_required_pred": ["y", "e", "s"], "reasoning_free_pred": ["n", "o"]}, "long_answer": "Our long-term study showed significantly better (2-fold) results regarding the continence score for the abdominal approach compared with the transanal pull-through. The stool pattern and enterocolitis scores were somewhat better for the TERPT group. These findings raise an important issue about the current surgical management of HD; however, more cases will need to be studied before a definitive conclusion can be drawn.", "final_decision": "no"}, "target": "no", "arguments": ["Abstract: To test if secular growth acceleration occurs during fetal life.\nANOVA Kruskal-Wallis and Mann-Whitney U-test have been used for the biometric characteristics comparison of nowadays fetal population with those three decades ago and to test the hypothesis about the existence of secular growth acceleration during fetal life. For this purpose, we first calculated mean values of particular biometric parameters for the whole pregnancy. During the period 2002-2009 biparietal diameter, fetal length and abdominal circumference measurements in singleton uncomplicated pregnancies between 22 and 41 gestational weeks were obtained. Gestational age was estimated according to Naegele's rule and confirmed with an early ultrasound examination. Pregnancies with fetal cromosomopathies and malformations were excluded as well as those resulting in perinatal death.\nThere were no statistically significant differences of the examined fetal biometric parameters measured by ultrasound between contemporary fetal population and those from 35 years ago.\nQuestion: The secular growth acceleration: does it appear during fetal life?\nAnswer: no\n\nAbstract: Phacodonesis can occur in pseudoexfoliation syndrome because of impaired zonular support. This study investigates whether the increased mobility of the lens influences anterior chamber depth in patients with pseudoexfoliation while assuming a prone position.\nCentral anterior chamber depth was measured in 39 patients with clinically apparent unilateral pseudoexfoliation and elevated intraocular pressure. Patients were placed in a face-up position for 5 minutes, at which time anterior chamber depth and axial length were measured by A scan, and intraocular pressure was measured by Tonopen (Oculab, La Jolla, CA) in both eyes. 
The measurements were repeated on both eyes after 5 minutes in a face-down position.\nNo significant differences in intraocular pressure or axial length between the prone and supine positions were found in either eye. Anterior chamber depth in eyes with pseudoexfoliation decreased from a mean of 3.08 mm in the supine position to a mean of 2.95 mm in the prone position, whereas mean anterior chamber depth in the fellow eyes decreased from 3.01 mm to 2.97 mm. The decrease in anterior chamber depth when facing down in the eyes with pseudoexfoliation was significantly greater than in the fellow eyes.\nQuestion: Does head positioning influence anterior chamber depth in pseudoexfoliation syndrome?\nAnswer: yes\n\nAbstract: The aim of this study was to describe the evolution and epidemiologic characteristics of shigellosis patients over a 25 year period in a large city.\nShigellosis is a notifiable disease in Spain since 1988. Cases are analyzed in Barcelona residents included in the registry between 1988-2012. A descriptive analysis by sex, age, mode of transmission and Shigella species is presented. Trend analysis and time series were performed.\nOf the 559 cases analyzed, 60.15% were males. A sustained increase was observed in the trend since 2008 in males (p<0,05), especially at the expense of males who had no history of food poisoning or travel to endemic areas. The increasing tendency was greater in males from 21 to 60 years, both for S. flexneri (since 2009), and for S. sonnei (since 2004). In 2012 it was noted that in the men with S. flexneri, the 63% were men who have sex with men.\nQuestion: Analysis of the epidemiological pattern of Shigellosis in Barcelona between 1988 and 2012: Is it an emerging sexually transmitted infection?\nAnswer: yes\n\nAbstract: To investigate the role of human T-lymphotrophic virus type I (HTLV-I) infection in four patients who developed slowly progressive myelopathy with abnormal MRI lesions in the cervical cord levels.\nClinical and neuroradiologic examinations were performed, and the odds that an HTLV-I-infected individual of specified genotype, age, and provirus load had HTLV-I-associated myelopathy (HAM)/tropical spastic paraparesis (TSP) were calculated.\nAnti-HTLV-I antibodies were positive in both the serum and the CSF in all of the patients. Biopsied sample from spinal cord lesions showed inflammatory changes in Patient 1. Patient 2 had a demyelinating type of sensorimotor polyneuropathy. Two of the three patients examined showed high risk of developing HAM/TSP in virologic and immunologic aspects.\nQuestion: Chronic progressive cervical myelopathy with HTLV-I infection: Variant form of HAM/TSP?\nAnswer: yes\n\nAbstract: The transanal endorectal pull-through (TERPT) is becoming the most popular procedure in the treatment of Hirschsprung disease (HD), but overstretching of the anal sphincters remains a critical issue that may impact the continence. This study examined the long-term outcome of TERPT versus conventional transabdominal (ABD) pull-through for HD.\nRecords of 41 patients more than 3 years old who underwent a pull-through for HD (TERPT, n = 20; ABD, n = 21) were reviewed, and their families were thoroughly interviewed and scored via a 15-item post-pull-through long-term outcome questionnaire. Patients were operated on between the years 1995 and 2003. During this time, our group transitioned from the ABD to the TERPT technique. Total scoring ranged from 0 to 40: 0 to 10, excellent; 11 to 20 good; 21 to 30 fair; 31 to 40 poor. 
A 2-tailed Student t test, analysis of covariance, as well as logistic and linear regression were used to analyze the collected data with confidence interval higher than 95%.\nOverall scores were similar. However, continence score was significantly better in the ABD group, and the stool pattern score was better in the TERPT group. A significant difference in age at interview between the 2 groups was noted; we therefore reanalyzed the data controlling for age, and this showed that age did not significantly affect the long-term scoring outcome between groups.\nQuestion: Are the long-term results of the transanal pull-through equal to those of the transabdominal pull-through?\nAnswer:", " yes"], "resps": [[[-1.1190098524093628, true]], [[-1.6553257703781128, false]], [[-9.717825889587402, false], [-9.717825889587402, false], [-9.717825889587402, false], [-9.717825889587402, false]]], "filtered_resps": [[-1.1190098524093628, true], [-1.6553257703781128, false], [-9.717825889587402, false]], "acc": 0.0} {"doc_id": 4, "doc": {"pubid": 10808977, "question": "Can tailored interventions increase mammography use among HMO women?", "context": {"contexts": ["Telephone counseling and tailored print communications have emerged as promising methods for promoting mammography screening. However, there has been little research testing, within the same randomized field trial, of the efficacy of these two methods compared to a high-quality usual care system for enhancing screening. This study addressed the question: Compared to usual care, is tailored telephone counseling more effective than tailored print materials for promoting mammography screening?", "Three-year randomized field trial.", "One thousand ninety-nine women aged 50 and older recruited from a health maintenance organization in North Carolina.", "Women were randomized to 1 of 3 groups: (1) usual care, (2) tailored print communications, and (3) tailored telephone counseling.", "Adherence to mammography screening based on self-reports obtained during 1995, 1996, and 1997.", "Compared to usual care alone, telephone counseling promoted a significantly higher proportion of women having mammograms on schedule (71% vs 61%) than did tailored print (67% vs 61%) but only after the first year of intervention (during 1996). Furthermore, compared to usual care, telephone counseling was more effective than tailored print materials at promoting being on schedule with screening during 1996 and 1997 among women who were off-schedule during the previous year."], "labels": ["BACKGROUND", "DESIGN", "PARTICIPANTS", "INTERVENTION", "MAIN OUTCOME", "RESULTS"], "meshes": ["Cost-Benefit Analysis", "Female", "Health Maintenance Organizations", "Humans", "Logistic Models", "Mammography", "Marketing of Health Services", "Middle Aged", "North Carolina", "Odds Ratio", "Pamphlets", "Patient Acceptance of Health Care", "Patient Satisfaction", "Reminder Systems", "Telephone"], "reasoning_required_pred": ["y", "e", "s"], "reasoning_free_pred": ["n", "o"]}, "long_answer": "The effects of the intervention were most pronounced after the first intervention. Compared to usual care, telephone counseling seemed particularly effective at promoting change among nonadherent women, the group for whom the intervention was developed. These results suggest that telephone counseling, rather than tailored print, might be the preferred first-line intervention for getting nonadherent women on schedule for mammography screening. 
Many questions would have to be answered about why the tailored print intervention was not more powerful. Nevertheless, it is clear that additional interventions will be needed to maintain women's adherence to mammography. Medical Subject Headings (MeSH): mammography screening, telephone counseling, tailored print communications, barriers.", "final_decision": "yes"}, "target": "yes", "arguments": ["Abstract: In recent years the role of trace elements in lithogenesis has received steadily increasing attention.\nThis study was aimed to attempt to find the correlations between the chemical content of the stones and the concentration of chosen elements in the urine and hair of stone formers.\nThe proposal for the study was approved by the local ethics committee. Specimens were taken from 219 consecutive stone-formers. The content of the stone was evaluated using atomic absorption spectrometry, spectrophotometry, and colorimetric methods. An analysis of 29 elements in hair and 21 elements in urine was performed using inductively coupled plasma-atomic emission spectrometry.\nOnly a few correlations between the composition of stones and the distribution of elements in urine and in hair were found. All were considered incidental.\nQuestion: Can we predict urinary stone composition based on an analysis of microelement concentration in the hair and urine?\nAnswer: no\n\nAbstract: Refusal of patients to participate in intervention programs is an important problem in clinical trials but, in general, researchers devote relatively little attention to it. In this article, a comparison is made between patients who, after having been invited, agreed to participate in a self-management intervention (participants) and those who refused (refusers). Compared with other studies of refusers, relatively more information could be gathered with regard to both their characteristics and reasons for refusing, because all potential participants were invited personally.\nOlder patients from a Dutch outpatient clinic were invited to participate in a self-management intervention, and their characteristics were assessed. Demographic data were collected, as well as data on physical functioning and lack of emotional support. People who refused to participate were asked to give their reasons for refusing.\nOf the 361 patients invited, 267 (74%) refused participation. These refusers were more restricted in their mobility, lived further away from the location of the intervention, and had a partner more often than did the participants. No differences were found in level of education, age or gender. The main reasons given by respondents for refusing to participate were lack of time, travel distance, and transport problems.\nQuestion: Do older patients who refuse to participate in a self-management intervention in the Netherlands differ from older patients who agree to participate?\nAnswer: yes\n\nAbstract: Studies have identified clinical predictors to guide radiologic evaluation of the cervical spine in geriatric patients. We hypothesized that clinical predictors are not adequate in the identification of cervical spine fractures in geriatric blunt trauma patients with low-energy mechanism.\nA retrospective case-control study was performed on geriatric blunt trauma patients sustaining low-energy trauma from January 2000 to January 2006. A data form including 8 clinical predictors was completed for each group.\nThere were 35 study and 64 control patients identified. 
Both groups were similar in age (study 83.6 vs control 81.2) and injury severity score (study 9.06 vs control 9.61). Only neck tenderness exceeded the expected occurrence in the presence of a cervical spine injury (chi(2) = 18.1, P = .001) in just 45.5% of the study group.\nQuestion: Cervical spine fractures in geriatric blunt trauma patients with low-energy mechanism: are clinical predictors adequate?\nAnswer: no\n\nAbstract: Recently, increasing number of literature has identified the posterior tibial slope (PTS) as one of the risk factors of primary anterior cruciate ligament (ACL) injury. However, few studies concerning the association between failure of ACL reconstruction (ACLR) and PTS have been published. The objective of this study was to explore the association between the failure of ACLR and PTS at a minimum of two years follow-up.\nTwo hundred and thirty eight eligible patients from June 2009 to October 2010 were identified from our database. A total of 20 failure cases of ACLR and 20 randomly selected controls were included in this retrospective study. The demographic data and the results of manual maximum side-to-side difference with KT-1000 arthrometer at 30° of knee flexion and pivot-shift test before the ACLR and at the final follow-up were collected. The medial and lateral PTSs were measured using the magnetic resonance imaging (MRI) scan, based on Hudek's measurement. A comparison of PTS between the two groups was performed.\nThe overall failure rate of the present study was 8.4%. Of the 40 participants, the mean medial PTS was 4.1° ± 3.2° and the mean lateral PTS was 4.6° ± 2.6°. The medial PTS of the ACLR failure group was significantly steeper than the control group (3.5° ± 2.5° vs. 6.1° ± 2.1°, P = 0.000). Similarly, the lateral PTS of the ACLR failure group was significantly steeper than the control group (2.9° ± 2.1° vs. 5.5° ± 3.0°, P = 0.006). For medial PTS ≥ 5°, the odds ratio of ACLR failure was 6.8 (P = 0.007); for lateral PTS ≥5°, the odds ratio of ACLR failure was 10.8 (P = 0.000).\nQuestion: Are failures of anterior cruciate ligament reconstruction associated with steep posterior tibial slopes?\nAnswer: yes\n\nAbstract: Telephone counseling and tailored print communications have emerged as promising methods for promoting mammography screening. However, there has been little research testing, within the same randomized field trial, of the efficacy of these two methods compared to a high-quality usual care system for enhancing screening. This study addressed the question: Compared to usual care, is tailored telephone counseling more effective than tailored print materials for promoting mammography screening?\nThree-year randomized field trial.\nOne thousand ninety-nine women aged 50 and older recruited from a health maintenance organization in North Carolina.\nWomen were randomized to 1 of 3 groups: (1) usual care, (2) tailored print communications, and (3) tailored telephone counseling.\nAdherence to mammography screening based on self-reports obtained during 1995, 1996, and 1997.\nCompared to usual care alone, telephone counseling promoted a significantly higher proportion of women having mammograms on schedule (71% vs 61%) than did tailored print (67% vs 61%) but only after the first year of intervention (during 1996). 
Furthermore, compared to usual care, telephone counseling was more effective than tailored print materials at promoting being on schedule with screening during 1996 and 1997 among women who were off-schedule during the previous year.\nQuestion: Can tailored interventions increase mammography use among HMO women?\nAnswer:", " yes"], "resps": [[[-1.5089194774627686, false]], [[-1.9235846996307373, false]], [[-9.15413761138916, false], [-9.15413761138916, false], [-9.15413761138916, false], [-9.15413761138916, false]]], "filtered_resps": [[-1.5089194774627686, false], [-1.9235846996307373, false], [-9.15413761138916, false]], "acc": 1.0} {"doc_id": 5, "doc": {"pubid": 23831910, "question": "Double balloon enteroscopy: is it efficacious and safe in a community setting?", "context": {"contexts": ["From March 2007 to January 2011, 88 DBE procedures were performed on 66 patients. Indications included evaluation anemia/gastrointestinal bleed, small bowel IBD and dilation of strictures. Video-capsule endoscopy (VCE) was used prior to DBE in 43 of the 66 patients prior to DBE evaluation.", "The mean age was 62 years. Thirty-two patients were female, 15 were African-American; 44 antegrade and 44 retrograde DBEs were performed. The mean time per antegrade DBE was 107.4±30.0 minutes with a distance of 318.4±152.9 cm reached past the pylorus. The mean time per lower DBE was 100.7±27.3 minutes with 168.9±109.1 cm meters past the ileocecal valve reached. Endoscopic therapy in the form of electrocautery to ablate bleeding sources was performed in 20 patients (30.3%), biopsy in 17 patients (25.8%) and dilation of Crohn's-related small bowel strictures in 4 (6.1%). 43 VCEs with pathology noted were performed prior to DBE, with findings endoscopically confirmed in 32 cases (74.4%). In 3 cases the DBE showed findings not noted on VCE."], "labels": ["METHODS", "RESULTS"], "meshes": ["Community Health Centers", "Double-Balloon Enteroscopy", "Female", "Humans", "Intestinal Diseases", "Male", "Middle Aged"], "reasoning_required_pred": ["y", "e", "s"], "reasoning_free_pred": ["y", "e", "s"]}, "long_answer": "DBE appears to be equally safe and effective when performed in the community setting as compared to a tertiary referral center with a comparable yield, efficacy, and complication rate.", "final_decision": "yes"}, "target": "yes", "arguments": ["Abstract: This study aimed to show the relationship between serum paraoxonase 1 level and the epicardial fat tissue thickness.\nTwo hundred and seven patients without any atherosclerotic disease history were included in this cross-sectional observational study. Correlation analysis was performed to determine the correlation between epicardial fat tissue thickness, which was measured by echocardiography and serum paraoxonase 1 level. Also correlation analysis was performed to show correlation between patients' clinical and laboratory findings and the level of serum paraoxonase 1 (PON 1) and the epicardial fat tissue thickness. Pearson and Spearman test were used for correlation analysis.\nNo linear correlation between epicardial fat tissue thickness and serum PON 1 found (correlation coefficient: -0.127, p=0.069). When epicardial fat tissue thickness were grouped as 7 mm and over, and below, and 5 mm and over, and below, serum PON 1 level were significantly lower in ≥7 mm group (PON1 : 168.9 U/L) than<7 mm group (PON 1: 253.9 U/L) (p<0.001). Also hypertension prevalence was increased in ≥7 mm group (p=0.001). 
Serum triglyceride was found to be higher in ≥7 mm group (p=0.014), body mass index was found higher in ≥5 mm group (p=0.006).\nQuestion: Is there a relationship between serum paraoxonase level and epicardial fat tissue thickness?\nAnswer: no\n\nAbstract: In primary and secondary prevention trials, statins have been shown to reduce the risk of stroke. In addition to lipid lowering, statins have a number of antiatherothrombotic and neuroprotective properties. In a preliminary observational study, we explored whether clinical outcome is improved in patients who are on treatment with statins when stroke occurs.\nWe conducted a population-based case-referent study of 25- to 74-year-old stroke patients with, for each case of a patient who was on statin treatment at the onset of stroke (n=125), 2 referent patients who were not treated with statins but were matched for age, gender, year of onset, and stroke subtype (n=250).\nThe unadjusted odds ratio for early discharge to home (versus late discharge or death) was 1.41 (95% CI 0.91 to 2.17) when patients on statin treatment were compared with referent stroke patients not on statins. Prognostic factors were, in general, more unfavorable among patients on statins. When this was adjusted for in a logistic regression model, the use of statins was a moderately strong but statistically nonsignificant predictor of discharge to home (multiple-adjusted odds ratio 1.42, 95% CI 0.90 to 2.22).\nQuestion: Does pretreatment with statins improve clinical outcome after stroke?\nAnswer: no\n\nAbstract: Blood stream infection (BSI) and the subsequent development of sepsis are among the most common infection complications occurring in severe burn patients. This study was designed to evaluate the relationship between the burn wound flora and BSI pathogens.\nDocumentation of all bacterial and fungal wound and blood isolates from severe burn patients hospitalized in the burn unit and intensive care unit was obtained from medical records retrieved retrospectively from a computerized, hospital-wide database over a 13-year period. All data were recorded in relation to the Ryan score.\nOf 195 severe burn patients, 88 had at least 1 BSI episode. Transmission of the same pathogen from wound to blood was documented in 30% of the patients, with a rising BSI frequency as the Ryan score increased. There were a total of 263 bacteremic episodes in 88 study patients, 44% of blood isolates were documented previously in wound cultures, and transmission of the same pathogen from wound to blood was noted in 65% of bacteremic patients.\nQuestion: Do Wound Cultures Give Information About the Microbiology of Blood Cultures in Severe Burn Patients?\nAnswer: yes\n\nAbstract: Gallbladder carcinoma is characterized by delayed diagnosis, ineffective treatment and poor prognosis. Surgical resection has been thought to be the treatment of choice, while the role of radiotherapy as adjuvant or palliative treatment has not been fully clarified in the literature.\nWe present the case of a 45-year-old female, with unresectable gallbladder carcinoma, grade IV, histologically diagnosed during laparotomy. The patient was treated with palliative intent with percutaneous transhepatic biliary drainage. Furthermore, she received external radiotherapy by (60)Co, using a three-field technique (anterior-posterior and right lateral). 
The total dose was 3,000 cGy in 10 fractions, with 300 cGy per fraction, 5 days weekly.\nThe patient showed clinico-laboratory improvement and was discharged with a permanent percutaneous transhepatic endoprosthesis. During follow-up (10 and 12 months postirradiation), abdominal CTs showed no local extension of the tumor, while the patient had a good performance status. So far, 1 year after the diagnosis of gallbladder cancer she is still alive.\nQuestion: Is external palliative radiotherapy for gallbladder carcinoma effective?\nAnswer: yes\n\nAbstract: From March 2007 to January 2011, 88 DBE procedures were performed on 66 patients. Indications included evaluation anemia/gastrointestinal bleed, small bowel IBD and dilation of strictures. Video-capsule endoscopy (VCE) was used prior to DBE in 43 of the 66 patients prior to DBE evaluation.\nThe mean age was 62 years. Thirty-two patients were female, 15 were African-American; 44 antegrade and 44 retrograde DBEs were performed. The mean time per antegrade DBE was 107.4±30.0 minutes with a distance of 318.4±152.9 cm reached past the pylorus. The mean time per lower DBE was 100.7±27.3 minutes with 168.9±109.1 cm meters past the ileocecal valve reached. Endoscopic therapy in the form of electrocautery to ablate bleeding sources was performed in 20 patients (30.3%), biopsy in 17 patients (25.8%) and dilation of Crohn's-related small bowel strictures in 4 (6.1%). 43 VCEs with pathology noted were performed prior to DBE, with findings endoscopically confirmed in 32 cases (74.4%). In 3 cases the DBE showed findings not noted on VCE.\nQuestion: Double balloon enteroscopy: is it efficacious and safe in a community setting?\nAnswer:", " yes"], "resps": [[[-0.7459797263145447, true]], [[-1.4071965217590332, false]], [[-9.121963500976562, false], [-9.121963500976562, false], [-9.121963500976562, false], [-9.121963500976562, false]]], "filtered_resps": [[-0.7459797263145447, true], [-1.4071965217590332, false], [-9.121963500976562, false]], "acc": 1.0} {"doc_id": 6, "doc": {"pubid": 26037986, "question": "30-Day and 1-year mortality in emergency general surgery laparotomies: an area of concern and need for improvement?", "context": {"contexts": ["Emergency surgery is associated with poorer outcomes and higher mortality with recent studies suggesting the 30-day mortality to be 14-15%. The aim of this study was to analyse the 30-day mortality, age-related 30-day mortality and 1-year mortality following emergency laparotomy. We hope this will encourage prospective data collection, improvement of care and initiate strategies to establish best practice in this area.", "This was a retrospective study of patients who underwent emergency laparotomy from June 2010 to May 2012. The primary end point of the study was 30-day mortality, age-related 30-day mortality and 1-year all-cause mortality.", "477 laparotomies were performed in 446 patients. 57% were aged<70 and 43% aged>70 years. 30-day mortality was 12, 4% in those aged<70 years and 22% in those>70 years (p<0.001). 
1-year mortality was 25, 15% in those aged under 70 years and 38% in those aged>70 years (p<0.001)."], "labels": ["AIMS", "METHODS", "RESULTS"], "meshes": ["Adult", "Age Factors", "Aged", "Aged, 80 and over", "Cause of Death", "Cohort Studies", "Emergency Treatment", "Female", "General Surgery", "Humans", "Incidence", "Laparotomy", "Male", "Middle Aged", "Needs Assessment", "Retrospective Studies", "Risk Assessment", "Time Factors", "United Kingdom"], "reasoning_required_pred": ["m", "a", "y", "b", "e"], "reasoning_free_pred": ["y", "e", "s"]}, "long_answer": "Emergency laparotomy carries a high rate of mortality, especially in those over the age of 70 years, and more needs to be done to improve outcomes, particularly in this group. This could involve increasing acute surgical care manpower, early recognition of patients requiring emergency surgery, development of clear management protocols for such patients or perhaps even considering centralisation of emergency surgical services to specialist centres with multidisciplinary teams involving emergency surgeons and care of the elderly physicians in hospital and related community outreach services for post-discharge care.", "final_decision": "maybe"}, "target": "maybe", "arguments": ["Abstract: Several studies have suggested a protective effect of folic acid (FA) on congenital heart anomalies. Down syndrome (DS) infants are known to have a high frequency of heart anomalies. Not all children with DS suffer from heart anomalies, which raises the question whether maternal factors might affect the risk of these anomalies. Our objectives were to investigate whether first-trimester FA use protects against heart anomalies among DS children.\nWomen with liveborn DS children participating in the Slone Epidemiology Center Birth Defects Study between 1976 and 1997 were included. We performed case-control analyses using DS, with heart anomalies as cases and DS, without heart anomalies as controls. Subanalyses were performed for defects that have been associated with FA in non-DS populations (conotruncal, ventricular septal [VSD]) and for those that are associated with DS (ostium secundum type atrial septal defects [ASD]and endocardial cushion defects [ECD]). Exposure was defined as the use of any FA-containing product for an average of at least 4 days per week during the first 12 weeks of pregnancy, whereas no exposure was defined as no use of FA in these 12 weeks.\nOf the 223 cases, 110 (49%) were exposed versus 84 (46%) of the 184 controls. After adjustment for possible confounders, no protective effect of FA was found on heart anomalies overall (OR 0.95, 95% CI: 0.61-1.47) nor separately for conotruncal defects, VSDs, ASDs, or ECDs.\nQuestion: Can folic acid protect against congenital heart defects in Down syndrome?\nAnswer: no\n\nAbstract: The use of open access endoscopy is increasing. Its effect on the adequacy of patient informed consent, procedure acceptance and the impact on subsequent communication/transfer of procedure results to the patient have not been evaluated. The aim of our study was to compare the extent of preknowledge of procedures and test explanation, patient medical complexity, information transfer and overall patient satisfaction between a patient group referred for outpatient open access endoscopy versus a patient group from a gastrointestinal (GI) subspecialty clinic.\nInformation was obtained from all patients presenting for outpatient upper and lower endoscopy by using a 1-page questionnaire. 
Patients from the two groups who had an outpatient upper/lower endoscopic procedure were contacted by phone after the procedure to obtain information with a standardized questionnaire.\nThe open access patients reported receiving significantly less information to help them identify the procedure (p<0.01) and less explanation concerning the nature of the procedure than the group of patients referred from the subspecialty clinic (p<0.005). There was no difference between the two groups in satisfaction scores for examinations performed under conscious sedation. For flexible sigmoidoscopy without sedation, however, the GI clinic patient group were more satisfied with their procedure. The majority of patients, regardless of access, were more likely to receive endoscopic results from a gastroenterologist than the referring physician. Furthermore, the patients in the GI clinic group who underwent colonoscopy felt significantly better at follow-up.\nQuestion: Does open access endoscopy close the door to an adequately informed patient?\nAnswer: yes\n\nAbstract: Treatment delays in breast cancer are generally thought to affect prognosis but the impact on survival remains unclear. Indicators for breast cancer care include time to primary treatment. The purpose of this study was to evaluate whether time to primary treatment (TPT) in breast cancer impacts survival.\nA total of 648 breast cancer patients treated in the University Malaya Medical Center (UMMC), Malaysia between 2004 and 2005 were included in the study. TPT was calculated from the date of pathological diagnosis to the date of primary treatment. Mortality data was obtained from the National Registry of Births and Deaths. Last date of follow-up was November 2010.\nMedian TPT was 18 days. Majority 508 (69.1%) of the patients received treatment within 30 days after diagnosis. The majority was surgically treated. Ethnicity (p=0.002) and stage at presentation (p=0.007) were significantly associated with delayed TPT. Malay ethnicity had delayed TPT compared to the Chinese; Hazard Ratio (HR) 1.9 (Confidence Interval (CI) 1.237, 2.987). Delayed TPT did not affect overall survival on univariate and multivariate analyses.\nQuestion: Delays in time to primary treatment after a diagnosis of breast cancer: does it impact survival?\nAnswer: no\n\nAbstract: To evaluate the outcome of a new modification of percutaneous needle suspension, using a bone anchor system for fixing the suture at the public bone, and to compare the results with those published previously.\nFrom March 1996, 37 patients with stress urinary incontinence (>2 years) were treated using a bone anchor system. On each side the suture was attached to the pubocervical fascia and the vaginal wall via a broad 'Z'-stitch. A urodynamic investigation performed preoperatively in all patients confirmed stress incontinence and excluded detrusor instability. The outcome was assessed by either by a clinical follow-up investigation or using a standardized questionnaire, over a mean follow-up of 11 months (range 6-18).\nIn the 37 patients, the procedure was successful in 25 (68%), with 16 (43%) of the patients completely dry and nine (24%) significantly improved. Removal of the bone anchor and suture was necessary in two patients, because of unilateral bacterial infection in one and a bilateral soft tissue granuloma in the other. One bone anchor became dislocated in a third patient. In two cases where the treatment failed, new detrusor instability was documented urodynamically. 
Minor complications were prolonged wound pain in 10 (26%) and transient urinary retention or residual urine in 12 patients (32%).\nQuestion: Does bone anchor fixation improve the outcome of percutaneous bladder neck suspension in female stress urinary incontinence?\nAnswer: yes\n\nAbstract: Emergency surgery is associated with poorer outcomes and higher mortality with recent studies suggesting the 30-day mortality to be 14-15%. The aim of this study was to analyse the 30-day mortality, age-related 30-day mortality and 1-year mortality following emergency laparotomy. We hope this will encourage prospective data collection, improvement of care and initiate strategies to establish best practice in this area.\nThis was a retrospective study of patients who underwent emergency laparotomy from June 2010 to May 2012. The primary end point of the study was 30-day mortality, age-related 30-day mortality and 1-year all-cause mortality.\n477 laparotomies were performed in 446 patients. 57% were aged<70 and 43% aged>70 years. 30-day mortality was 12, 4% in those aged<70 years and 22% in those>70 years (p<0.001). 1-year mortality was 25, 15% in those aged under 70 years and 38% in those aged>70 years (p<0.001).\nQuestion: 30-Day and 1-year mortality in emergency general surgery laparotomies: an area of concern and need for improvement?\nAnswer:", " yes"], "resps": [[[-0.7148197293281555, true]], [[-1.489363431930542, false]], [[-9.937323570251465, false], [-9.937323570251465, false], [-9.937323570251465, false], [-9.937323570251465, false]]], "filtered_resps": [[-0.7148197293281555, true], [-1.489363431930542, false], [-9.937323570251465, false]], "acc": 0.0} {"doc_id": 7, "doc": {"pubid": 26852225, "question": "Is adjustment for reporting heterogeneity necessary in sleep disorders?", "context": {"contexts": ["Anchoring vignettes are brief texts describing a hypothetical character who illustrates a certain fixed level of a trait under evaluation. This research uses vignettes to elucidate factors associated with sleep disorders in adult Japanese before and after adjustment for reporting heterogeneity in self-reports. This study also evaluates the need for adjusting for reporting heterogeneity in the management of sleep and energy related problems in Japan.", "We investigated a dataset of 1002 respondents aged 18 years and over from the Japanese World Health Survey, which collected information through face-to-face interview from 2002 to 2003. The ordered probit model and the Compound Hierarchical Ordered Probit (CHOPIT) model, which incorporated anchoring vignettes, were employed to estimate and compare associations of sleep and energy with socio-demographic and life-style factors before and after adjustment for differences in response category cut-points for each individual.", "The prevalence of self-reported problems with sleep and energy was 53 %. Without correction of cut-point shifts, age, sex, and the number of comorbidities were significantly associated with a greater severity of sleep-related problems. After correction, age, the number of comorbidities, and regular exercise were significantly associated with a greater severity of sleep-related problems; sex was no longer a significant factor. 
Compared to the ordered probit model, the CHOPIT model provided two changes with a subtle difference in the magnitude of regression coefficients after correction for reporting heterogeneity."], "labels": ["BACKGROUND", "METHODS", "RESULTS"], "meshes": ["Adult", "Aged", "Female", "Health Status Disparities", "Health Surveys", "Humans", "Japan", "Male", "Middle Aged", "Physical Fitness", "Prevalence", "Self Report", "Self-Assessment", "Sleep Wake Disorders", "Socioeconomic Factors"], "reasoning_required_pred": ["y", "e", "s"], "reasoning_free_pred": ["n", "o"]}, "long_answer": "Sleep disorders are common in the general adult population of Japan. Correction for reporting heterogeneity using anchoring vignettes is not a necessary tool for proper management of sleep and energy related problems among Japanese adults. Older age, gender differences in communicating sleep-related problems, the presence of multiple morbidities, and regular exercise should be the focus of policies and clinical practice to improve sleep and energy management in Japan.", "final_decision": "no"}, "target": "no", "arguments": ["Abstract: To determine if composite measures based on process indicators are consistent with short-term outcome indicators in surgical colorectal cancer care.\nLongitudinal analysis of consistency between composite measures based on process indicators and outcome indicators for 85 Dutch hospitals.\nThe Dutch Surgical Colorectal Audit database, the Netherlands.\n4732 elective patients with colon carcinoma and 2239 with rectum carcinoma treated in 85 hospitals were included in the analyses.\nAll available process indicators were aggregated into five different composite measures. The association of the different composite measures with risk-adjusted postoperative mortality and morbidity was analysed at the patient and hospital level.\nAt the patient level, only one of the composite measures was negatively associated with morbidity for rectum carcinoma. At the hospital level, a strong negative association was found between composite measures and hospital mortality and morbidity rates for rectum carcinoma (p<0.05), and hospital morbidity rates for colon carcinoma.\nQuestion: Combining process indicators to evaluate quality of care for surgical patients with colorectal cancer: are scores consistent with short-term outcome?\nAnswer: maybe\n\nAbstract: Human immunodeficiency virus (HIV)-infected patients have generally been excluded from transplantation. Recent advances in the management and prognosis of these patients suggest that this policy should be reevaluated.\nTo explore the current views of U.S. transplant centers toward transplanting asymptomatic HIV-infected patients with end-stage renal disease, a written survey was mailed to the directors of transplantation at all 248 renal transplant centers in the United States.\nAll 148 responding centers said they require HIV testing of prospective kidney recipients, and 84% of these centers would not transplant an individual who refuses HIV testing. The vast majority of responding centers would not transplant a kidney from a cadaveric (88%) or a living donor (91%) into an asymptomatic HIV-infected patient who is otherwise a good candidate for transplantation. Among the few centers that would consider transplanting an HIV-infected patient, not a single center had performed such a transplant in the year prior to the survey. 
Most centers fear that transplantation in the face of HIV infection would be harmful to the individual, and some believe that it would be a waste of precious organs.\nQuestion: Should all human immunodeficiency virus-infected patients with end-stage renal disease be excluded from transplantation?\nAnswer: no\n\nAbstract: In a prospective study 218 preschool children were enrolled (stratified in 2 training programs, one specialized for phonologic awareness in order to prevent dyslexia, the other consisting in training of general perception) during the last year of kindergarten. After finishing the first grade 131 children were compared in their reading and writing abilities.\nIn the whole group only a slight difference was found between both training modalities concerning their writing abilities. However, children with a history of hearing loss, actual hearing loss or pathologic middle ear findings profited most from the specialized training program compared to the control in their reading abilities.\nQuestion: Is a specialised training of phonological awareness indicated in every preschool child?\nAnswer: maybe\n\nAbstract: Epidemiological data show significant associations of vitamin D deficiency and autoimmune diseases. Vitamin D may prevent autoimmunity by stimulating naturally occurring regulatory T cells.\nTo elucidate whether vitamin D supplementation increases Tregs frequency (%Tregs) within circulating CD4+ T cells.\nWe performed an uncontrolled vitamin D supplementation trial among 50 apparently healthy subjects including supplementation of 140,000 IU at baseline and after 4 weeks (visit 1). The final follow-up visit was performed 8 weeks after the baseline examination (visit 2). Blood was drawn at each study visit to determine 25-hydroxyvitamin D levels and %Tregs. Tregs were characterized as CD4+CD25++ T cells with expression of the transcription factor forkhead box P3 and low or absent expression of CD127.\nForty-six study participants (65% females, mean age +/- SD 31 +/- 8 years) completed the trial. 25(OH)D levels increased from 23.9 +/- 12.9 ng/ml at baseline to 45.9 +/- 14.0 ng/ml at visit 1 and 58.0 +/- 15.1 ng/ml at visit 2. %Tregs at baseline were 4.8 +/- 1.4. Compared to baseline levels we noticed a significant increase of %Tregs at study visit 1 (5.9 +/- 1.7, P<0.001) and 2 (5.6 +/- 1.6, P<0.001).\nQuestion: Vitamin D supplementation and regulatory T cells in apparently healthy subjects: vitamin D treatment for autoimmune diseases?\nAnswer: yes\n\nAbstract: Anchoring vignettes are brief texts describing a hypothetical character who illustrates a certain fixed level of a trait under evaluation. This research uses vignettes to elucidate factors associated with sleep disorders in adult Japanese before and after adjustment for reporting heterogeneity in self-reports. This study also evaluates the need for adjusting for reporting heterogeneity in the management of sleep and energy related problems in Japan.\nWe investigated a dataset of 1002 respondents aged 18 years and over from the Japanese World Health Survey, which collected information through face-to-face interview from 2002 to 2003. 
The ordered probit model and the Compound Hierarchical Ordered Probit (CHOPIT) model, which incorporated anchoring vignettes, were employed to estimate and compare associations of sleep and energy with socio-demographic and life-style factors before and after adjustment for differences in response category cut-points for each individual.\nThe prevalence of self-reported problems with sleep and energy was 53 %. Without correction of cut-point shifts, age, sex, and the number of comorbidities were significantly associated with a greater severity of sleep-related problems. After correction, age, the number of comorbidities, and regular exercise were significantly associated with a greater severity of sleep-related problems; sex was no longer a significant factor. Compared to the ordered probit model, the CHOPIT model provided two changes with a subtle difference in the magnitude of regression coefficients after correction for reporting heterogeneity.\nQuestion: Is adjustment for reporting heterogeneity necessary in sleep disorders?\nAnswer:", " yes"], "resps": [[[-1.0884227752685547, true]], [[-1.1592693328857422, false]], [[-5.038801193237305, false], [-5.038801193237305, false], [-5.038801193237305, false], [-5.038801193237305, false]]], "filtered_resps": [[-1.0884227752685547, true], [-1.1592693328857422, false], [-5.038801193237305, false]], "acc": 0.0}
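The records above end with per-example scoring fields ("resps", "filtered_resps", "target", "acc") whose relationship is not spelled out anywhere in this file. The sketch below shows one plausible way such an accuracy flag could be recomputed from a record, assuming the common multiple-choice convention in which the three "filtered_resps" entries are log-likelihoods of the " yes", " no", and " maybe" continuations, in that order, and "acc" is 1.0 when the highest-scoring option equals "target". Both the option ordering and the metric are assumptions made for illustration; the stored "acc" values are left untouched and may have been produced by a different convention. A minimal Python sketch under those assumptions:

# Sketch only: recomputing a per-record accuracy under an ASSUMED convention.
# Assumption 1: filtered_resps[i] = [log-likelihood, is_greedy] for the
#               candidate continuations " yes", " no", " maybe", in that order.
# Assumption 2: acc = 1.0 iff the argmax option matches the record's "target".
import json

OPTIONS = ["yes", "no", "maybe"]  # assumed ordering of the scored continuations

def predicted_answer(record):
    # Pick the option whose continuation received the highest log-likelihood.
    scores = [loglik for loglik, _is_greedy in record["filtered_resps"]]
    best = max(range(len(scores)), key=lambda i: scores[i])
    return OPTIONS[best]

def recompute_acc(record):
    # 1.0 if the argmax option equals the stored target, else 0.0.
    return 1.0 if predicted_answer(record) == record["target"] else 0.0

if __name__ == "__main__":
    # One line of this file, shortened to the fields the sketch needs.
    line = '{"target": "yes", "filtered_resps": [[-0.75, true], [-1.41, false], [-9.12, false]]}'
    record = json.loads(line)
    print(predicted_answer(record), recompute_acc(record))  # -> yes 1.0

Under these assumptions the recomputed value agrees with the stored "acc" for most, but not all, of the records above, which suggests the actual option ordering or metric may differ; the sketch is intended only to make the meaning of the stored fields easier to follow.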