946 results for 08 Information and Computing Sciences
Abstract:
Background Access to health care can be described along four dimensions: geographic accessibility, availability, financial accessibility and acceptability. Geographic accessibility measures how physically accessible resources are for the population, while availability reflects what resources are available and in what amount. Combining these two types of measure into a single index provides a measure of geographic (or spatial) coverage, which is an important measure for assessing the degree of accessibility of a health care network. Results This paper describes the latest version of AccessMod, an extension to the Geographical Information System ArcView 3.x, and provides an example of application of this tool. AccessMod 3 allows one to compute geographic coverage of health care using terrain information and population distribution. Four major types of analysis are available in AccessMod: (1) modeling the coverage of catchment areas linked to an existing health facility network based on travel time, to provide a measure of physical accessibility to health care; (2) modeling geographic coverage according to the availability of services; (3) projecting the coverage of a scaling-up of an existing network; (4) providing information for cost-effectiveness analysis when little information about the existing network is available. In addition to integrating travelling time, population distribution and the population coverage capacity specific to each health facility in the network, AccessMod can incorporate the influence of landscape components (e.g. topography, river and road networks, vegetation) that impact travelling time to and from facilities. Topographical constraints can be taken into account through an anisotropic analysis that considers the direction of movement. We provide an example of the application of AccessMod in the southern part of Malawi that shows the influences of the landscape constraints and of the modes of transportation on geographic coverage. Conclusion By incorporating the demand (population) and the supply (capacities of health care centers), AccessMod provides a unifying tool to efficiently assess the geographic coverage of a network of health care facilities. This tool should be of particular interest to developing countries that have relatively good geographic information on population distribution, terrain, and health facility locations.
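A minimal sketch, assuming a toy speed raster, an invented facility capacity and a 1-hour threshold, of the kind of raster travel-time catchment and coverage analysis the abstract describes; this is illustrative Python, not the AccessMod ArcView extension itself.

```python
# Toy cost-distance catchment: accumulated travel time over a speed raster via
# Dijkstra, then population coverage capped by an (invented) facility capacity.
import heapq
import numpy as np

def travel_time_minutes(speed_kmh, cell_km, facility, max_minutes=60):
    """Accumulated travel time (minutes) from a facility cell over a speed raster."""
    rows, cols = speed_kmh.shape
    time = np.full((rows, cols), np.inf)
    time[facility] = 0.0
    heap = [(0.0, facility)]
    while heap:
        t, (r, c) = heapq.heappop(heap)
        if t > time[r, c] or t > max_minutes:
            continue
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and speed_kmh[nr, nc] > 0:
                # minutes needed to reach the neighbouring cell at its local speed
                step = 60.0 * cell_km / speed_kmh[nr, nc]
                if t + step < time[nr, nc]:
                    time[nr, nc] = t + step
                    heapq.heappush(heap, (t + step, (nr, nc)))
    return time

speed = np.full((50, 50), 5.0)      # 5 km/h walking everywhere (toy landscape)
speed[20:30, :] = 0.5               # a slow band, e.g. a swamp or steep slope
population = np.random.poisson(10, size=(50, 50)).astype(float)

t = travel_time_minutes(speed, cell_km=1.0, facility=(10, 10))
catchment = t <= 60                                   # 1-hour catchment area
demand = population[catchment].sum()
covered = min(demand, 20000)                          # cap by facility capacity
print(f"population in catchment: {demand:.0f}, covered: {covered:.0f}")
```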
Abstract:
Glaciers all over the world are expected to continue to retreat due to global warming throughout the 21st century. Consequently, future seasonal water availability might become scarce once glacier areas have declined below a certain threshold, affecting future water management strategies. Particular attention should be paid to glaciers located in a karstic environment, as parts of the meltwater can be drained by underlying karst systems, making it difficult to assess water availability. In this study, tracer experiments, karst modeling and glacier melt modeling are combined in order to identify flow paths in a high alpine, glacierized, karstic environment (Glacier de la Plaine Morte, Switzerland) and to investigate current and predict future downstream water availability. Flow paths through the karst underground were determined with natural and fluorescent tracers. Subsequently, geologic information and the findings from tracer experiments were assembled in a karst model. Finally, glacier melt projections driven with a climate scenario were performed to discuss future water availability in the area surrounding the glacier. The results suggest that during late summer glacier meltwater is rapidly drained through well-developed channels at the glacier bottom to the north of the glacier, while during the low flow season meltwater enters the karst and is drained to the south. Climate change projections with the glacier melt model reveal that by the end of the century glacier melt will be significantly reduced in the summer, jeopardizing water availability in glacier-fed karst springs.
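The abstract does not specify the melt model used; as a hedged illustration, a classical degree-day formulation is sketched below with an invented degree-day factor and a synthetic temperature series (not the study's calibrated model or scenario data).

```python
# Minimal degree-day sketch of glacier melt: daily melt [mm w.e.] = DDF * max(T - T_melt, 0).
# All numbers are invented for illustration.
import numpy as np

ddf = 6.0          # degree-day factor, mm w.e. per degree C per day (assumed)
t_melt = 0.0       # melt threshold temperature, degrees C
days = np.arange(365)
temp = 5.0 + 12.0 * np.sin(2 * np.pi * (days - 120) / 365)   # toy temperature series

melt = ddf * np.clip(temp - t_melt, 0.0, None)               # mm w.e. per day
glacier_area_km2 = 7.0                                        # roughly Plaine Morte scale
melt_volume_m3 = melt / 1000.0 * glacier_area_km2 * 1e6       # daily meltwater volume

print(f"annual melt: {melt.sum():.0f} mm w.e., "
      f"peak daily volume: {melt_volume_m3.max():.0f} m3")
```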
Abstract:
With improving clinical CT scanning technology, the accuracy of CT-based finite element (FE) models of the human skeleton may be ameliorated by an enhanced description of apparent-level bone mechanical properties. Micro-finite element (μFE) modeling can be used to study the apparent elastic behavior of human cancellous bone. In this study, samples from the femur, radius and vertebral body were investigated to evaluate the predictive power of morphology–elasticity relationships and to compare them across different anatomical regions. μFE models of 701 trabecular bone cubes with a side length of 5.3 mm were analyzed using kinematic boundary conditions. Based on the FE results, four morphology–elasticity models using bone volume fraction as well as full, limited or no fabric information were calibrated for each anatomical region. The 5-parameter Zysset–Curnier model using full fabric information showed excellent predictive power, with coefficients of determination (r²adj) of 0.98, 0.95 and 0.94 for the femur, radius and vertebra data, respectively, and mean total norm errors between 14 and 20%. A constant orthotropy model and a constant transverse isotropy model, where the elastic anisotropy is defined by the model parameters, yielded coefficients of determination between 0.90 and 0.98 with total norm errors between 16 and 25%. Neglecting fabric information and using an isotropic model led to r²adj between 0.73 and 0.92 with total norm errors between 38 and 49%. A comparison of the model regressions revealed minor but significant (p<0.01) differences between the fabric–elasticity model parameters calibrated for the different anatomical regions. The proposed models and identified parameters can be used in future studies to compute the apparent elastic properties of human cancellous bone for homogenized FE models.
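For illustration only, a simplified fabric–elasticity power law in the spirit of the Zysset–Curnier family of models is sketched below; the constants and exponents are invented placeholders, not the calibrated parameters reported in the study.

```python
# Simplified fabric-elasticity power law: apparent modulus scales with a power of
# bone volume fraction and with the fabric eigenvalues. All constants are invented.
import numpy as np

def apparent_moduli(bvtv, fabric_eigenvalues, e0=10_000.0, k=1.6, l=1.0):
    """Apparent Young's moduli (MPa) along the fabric principal directions.

    bvtv               : bone volume fraction (BV/TV)
    fabric_eigenvalues : eigenvalues m1..m3 of the fabric tensor, normalised
                         so that m1 + m2 + m3 = 3
    """
    m = np.asarray(fabric_eigenvalues, dtype=float)
    return e0 * bvtv**k * m**(2 * l)

# Example: a transversely isotropic cube with BV/TV = 0.20
print(apparent_moduli(0.20, [1.3, 0.85, 0.85]))
```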
Abstract:
BACKGROUND: Many users search the Internet for answers to health questions. Complementary and alternative medicine (CAM) is a particularly common search topic. Because many CAM therapies do not require a clinician's prescription, false or misleading CAM information may be more dangerous than information about traditional therapies. Many quality criteria have been suggested to filter out potentially harmful online health information. However, assessing the accuracy of CAM information is uniquely challenging since CAM is generally not supported by conventional literature. OBJECTIVE: The purpose of this study is to determine whether domain-independent technical quality criteria can identify potentially harmful online CAM content. METHODS: We analyzed 150 Web sites retrieved by searching the ten most commonly used search engines for the three most popular herbs (ginseng, ginkgo and St. John's wort) and their purported uses. The presence of technical quality criteria, as well as of potentially harmful statements (commissions) and vital information that should have been mentioned but was not (omissions), was recorded. RESULTS: Thirty-eight sites (25%) contained statements that could lead to direct physical harm if acted upon. One hundred forty-five sites (97%) had omitted information. We found no relationship between technical quality criteria and potentially harmful information. CONCLUSIONS: Current technical quality criteria do not identify potentially harmful CAM information online. Consumers should be warned to use other means of validation or to trust only known sites. Quality criteria that consider the uniqueness of CAM must be developed and validated.
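The abstract does not state which statistical test was used; as a hedged sketch, one common way to test for an association between a technical quality criterion and the presence of harmful statements is shown below on an invented 2×2 table (not the study data).

```python
# Invented counts chosen to show no association, mirroring the study's finding;
# Fisher's exact test is one standard choice for a 2x2 table of this size.
from scipy.stats import fisher_exact

#                     harmful   not harmful
# criterion present      12          36
# criterion absent       13          39     <- all counts invented
table = [[12, 36], [13, 39]]
odds_ratio, p_value = fisher_exact(table)
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.3f}")
```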
Abstract:
Currently more than half of Electronic Health Record (EHR) projects fail. Most of these failures are not due to flawed technology, but rather due to the lack of systematic consideration of human issues. Among the barriers to EHR adoption, function mismatching among users, activities, and systems is a major area that has not been systematically addressed from a human-centered perspective. A theoretical framework called the Functional Framework was developed for identifying and reducing functional discrepancies among users, activities, and systems. The Functional Framework is composed of three models – the User Model, the Designer Model, and the Activity Model. The User Model was developed by conducting a survey (N = 32) that identified the functions needed and desired from the user's perspective. The Designer Model was developed by conducting a systematic review of an Electronic Dental Record (EDR) and its functions. The Activity Model was developed using an ethnographic method called shadowing, in which EDR users (5 dentists, 5 dental assistants, 5 administrative personnel) were followed quietly and observed during their activities. These three models were combined to form a unified model. From the unified model, a work domain ontology was developed by asking users to rate its functions (190 in total) along the dimensions of frequency and criticality in a survey. The functional discrepancies, as indicated by the regions of the Venn diagram formed by the three models, were consistent with the survey results, especially with user satisfaction. The survey for the Functional Framework indicated a preference for one system over the other (R = 0.895). The results of this project showed that the Functional Framework provides a systematic method for identifying, evaluating, and reducing functional discrepancies among users, systems, and activities. Limitations and generalizability of the Functional Framework are discussed.
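As a purely illustrative sketch of the Venn-diagram regions formed by the three models, the snippet below treats each model as a set of functions; the function names are invented placeholders, not the 190 functions actually rated in the study.

```python
# Hypothetical function sets for the three models; the discrepancy regions are
# simply set differences between their union/intersection.
user_model     = {"enter diagnosis", "view chart", "order radiograph", "print invoice"}
designer_model = {"enter diagnosis", "view chart", "order radiograph", "audit log"}
activity_model = {"enter diagnosis", "view chart", "print invoice", "schedule recall"}

supported_and_used = user_model & designer_model & activity_model
wanted_but_missing = (user_model | activity_model) - designer_model   # discrepancy
built_but_unused   = designer_model - (user_model | activity_model)   # discrepancy

print("matched:", supported_and_used)
print("missing from the system:", wanted_but_missing)
print("unused system functions:", built_but_unused)
```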
Abstract:
OBJECTIVES: To determine the prevalence of false or misleading statements in messages posted by internet cancer support groups and whether these statements were identified as false or misleading and corrected by other participants in subsequent postings. DESIGN: Analysis of content of postings. SETTING: Internet cancer support group Breast Cancer Mailing List. MAIN OUTCOME MEASURES: Number of false or misleading statements posted from 1 January to 23 April 2005 and whether these were identified and corrected by participants in subsequent postings. RESULTS: 10 of 4600 postings (0.22%) were found to be false or misleading. Of these, seven were identified as false or misleading by other participants and corrected within an average of four hours and 33 minutes (maximum, nine hours and nine minutes). CONCLUSIONS: Most posted information on breast cancer was accurate. Most false or misleading statements were rapidly corrected by participants in subsequent postings.
Abstract:
People often use tools to search for information. In order to improve the quality of an information search, it is important to understand how internal information, which is stored in the user's mind, and external information, represented by the interface of tools, interact with each other. How information is distributed between internal and external representations significantly affects information search performance. However, few studies have examined the relationship between types of interface and types of search task in the context of information search. For a distributed information search task, how data are distributed, represented, and formatted significantly affects user search performance in terms of response time and accuracy. Guided by UFuRT (User, Function, Representation, Task), a human-centered process, I propose a search model and a task taxonomy. The model defines its relationship with other existing information models. The taxonomy clarifies the legitimate operations for each type of search task over relational data. Based on the model and taxonomy, I have also developed interface prototypes for the search tasks of relational data. These prototypes were used for experiments. The experiments described in this study are of a within-subject design with a sample of 24 participants recruited from the graduate schools located in the Texas Medical Center. Participants performed one-dimensional nominal search tasks over nominal, ordinal, and ratio displays, and searched one-dimensional nominal, ordinal, interval, and ratio tasks over table and graph displays. Participants also performed the same task and display combinations for two-dimensional searches. Distributed cognition theory was adopted as a theoretical framework for analyzing and predicting the search performance over relational data. It was shown that the representation dimensions and data scales, as well as the search task types, are the main factors determining search efficiency and effectiveness. In particular, the more external representations used, the better the search task performance, and the results suggest that the ideal search performance occurs when the question type and the corresponding data scale representation match. The implications of the study lie in contributing to the effective design of search interfaces for relational data, especially laboratory results, which are often used in healthcare activities.
Abstract:
A nonlinear viscoelastic image registration algorithm based on the demons paradigm and incorporating an inverse consistent constraint (ICC) is implemented. An inverse consistent and symmetric cost function using mutual information (MI) as a similarity measure is employed. The cost function also includes regularization of the transformation and of the inverse consistency error (ICE). The uncertainties in balancing the various terms in the cost function are avoided by alternately minimizing the similarity measure, the regularization of the transformation, and the ICE terms. Diffeomorphic registration, which prevents folding and/or tearing of the deformation, is achieved by a composition scheme. The quality of the image registration is first demonstrated by constructing a brain atlas from 20 adult brains (age range 30-60 years). It is shown that with this registration technique: (1) the Jacobian determinant is positive for all voxels and (2) the average ICE is around 0.004 voxels with a maximum value below 0.1 voxels. Further, deformation-based segmentation on the Internet Brain Segmentation Repository, a publicly available dataset, yielded a high Dice similarity index (DSI) of 94.7% for the cerebellum and 74.7% for the hippocampus, attesting to the quality of our registration method.
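A minimal, hypothetical sketch of one of the checks quoted above: computing the Jacobian determinant of a toy 2-D displacement field to verify that it is positive everywhere (i.e. no folding or tearing). This is illustrative Python, not the registration algorithm itself.

```python
# Jacobian determinant of the transform x + u(x): identity plus the displacement
# gradients. A positive determinant everywhere means the mapping does not fold.
import numpy as np

def jacobian_determinant_2d(disp):
    """disp: displacement field of shape (2, H, W) in voxel units."""
    dudy, dudx = np.gradient(disp[0])   # gradients of the x-displacement
    dvdy, dvdx = np.gradient(disp[1])   # gradients of the y-displacement
    return (1 + dudx) * (1 + dvdy) - dudy * dvdx

h = w = 64
y, x = np.mgrid[0:h, 0:w]
disp = np.stack([0.5 * np.sin(2 * np.pi * x / w),    # small smooth toy deformation
                 0.5 * np.sin(2 * np.pi * y / h)])
jac = jacobian_determinant_2d(disp)
print(f"min Jacobian determinant: {jac.min():.3f} (should be > 0)")
```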
Abstract:
BACKGROUND Rheumatic heart disease accounts for up to 250 000 premature deaths every year worldwide and can be regarded as a physical manifestation of poverty and social inequality. We aimed to estimate the prevalence of rheumatic heart disease in endemic countries as assessed by different screening modalities and as a function of age. METHODS We searched Medline, Embase, the Latin American and Caribbean System on Health Sciences Information, African Journals Online, and the Cochrane Database of Systematic Reviews for population-based studies published between Jan 1, 1993, and June 30, 2014, that reported on prevalence of rheumatic heart disease among children and adolescents (≥5 years to <18 years). We assessed prevalence of clinically silent and clinically manifest rheumatic heart disease in random effects meta-analyses according to screening modality and geographical region. We assessed the association between social inequality and rheumatic heart disease with the Gini coefficient. We used Poisson regression to analyse the effect of age on prevalence of rheumatic heart disease and estimated the incidence of rheumatic heart disease from prevalence data. FINDINGS We included 37 populations in the systematic review and meta-analysis. The pooled prevalence of rheumatic heart disease detected by cardiac auscultation was 2·9 per 1000 people (95% CI 1·7-5·0) and by echocardiography it was 12·9 per 1000 people (8·9-18·6), with substantial heterogeneity between individual reports for both screening modalities (I(2)=99·0% and 94·9%, respectively). We noted an association between social inequality expressed by the Gini coefficient and prevalence of rheumatic heart disease (p=0·0002). The prevalence of clinically silent rheumatic heart disease (21·1 per 1000 people, 95% CI 14·1-31·4) was about seven to eight times higher than that of clinically manifest disease (2·7 per 1000 people, 1·6-4·4). Prevalence progressively increased with advancing age, from 4·7 per 1000 people (95% CI 0·0-11·2) at age 5 years to 21·0 per 1000 people (6·8-35·1) at 16 years. The estimated incidence was 1·6 per 1000 people (0·8-2·3) and remained constant across age categories (range 2·5, 95% CI 1·3-3·7 in 5-year-old children to 1·7, 0·0-5·1 in 15-year-old adolescents). We noted no sex-related differences in prevalence (p=0·829). INTERPRETATION We found a high prevalence of rheumatic heart disease in endemic countries. Although a reduction in social inequalities represents the cornerstone of community-based prevention, the importance of early detection of silent rheumatic heart disease remains to be further assessed. FUNDING UBS Optimus Foundation.
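The abstract mentions random-effects meta-analysis of prevalence; the sketch below shows a standard DerSimonian-Laird pooling of logit-transformed prevalences on four invented studies, purely to illustrate the calculation (it is not the review's data or code).

```python
# DerSimonian-Laird random-effects pooling of prevalences on the logit scale.
# The four "studies" below are invented, not the 37 populations in the review.
import numpy as np

cases = np.array([12, 40, 8, 55])          # invented RHD cases per study
n     = np.array([900, 3100, 650, 4200])   # invented children screened

p     = cases / n
logit = np.log(p / (1 - p))
var   = 1 / cases + 1 / (n - cases)        # approximate variance of the logit

w     = 1 / var                            # fixed-effect weights
mu_fe = np.sum(w * logit) / np.sum(w)
q     = np.sum(w * (logit - mu_fe) ** 2)
tau2  = max(0.0, (q - (len(p) - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))

w_re  = 1 / (var + tau2)                   # random-effects weights
mu_re = np.sum(w_re * logit) / np.sum(w_re)
pooled = 1 / (1 + np.exp(-mu_re))
print(f"pooled prevalence ~ {1000 * pooled:.1f} per 1000 (tau2 = {tau2:.2f})")
```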
Abstract:
Rainer Werner Fassbinder's Garbage, The City, and Death: A Four-Act Scandal in Post-war Germany. The paper explores the dramaturgy of the scandals around the play Garbage, The City and Death (Der Müll, die Stadt und der Tod) by the German playwright, theatre and film maker Rainer Werner Fassbinder. Published in 1976, the play immediately caused a scandal in West Germany because it was accused of reproducing anti-Semitic stereotypes. The paper sheds light on the different phases of the scandal and their historical and cultural contexts in post-war Germany – starting as a literary scandal in 1976, transformed into a theatre scandal in the 1980s and finally dissolved by the German premiere in 2009. The paper is structured as follows: Act One: The Literary Scandal. Destroying Fassbinder's Garbage; Act Two: Preventing the Staging of the Play; Act Three: Blocking the Opening Night; Act Four: Performing the Play in Germany. By analysing the dramaturgical structure of this specific scandal, the paper discusses the following hypotheses: 1. Scandals arise through the circulation of decontextualised information in public. This is due either to a lack of information about the actual object or incident being scandalised or to a lack of information about its context. This lack is caused by the logic of the scandal itself: because the play or the performance is prohibited, it has been withdrawn from the public, making it impossible to form a well-founded opinion on the controversy. 2. The scandal is driven forward by an emotionalising rhetoric built around the decontextualised information. 3. Once the gap in information is filled, the scandalising rhetoric turns into a rhetoric of irrelevance: reviews of the first performance of Garbage, The City and Death in Germany considered the play hardly a matter of public concern.
Abstract:
XMapTools is a MATLAB®-based graphical user interface program for electron microprobe X-ray image processing, which can be used to estimate the pressure–temperature conditions of crystallization of minerals in metamorphic rocks. This program (available online at http://www.xmaptools.com) provides a method to standardize raw electron microprobe data and includes functions to calculate the oxide weight percent compositions of various minerals. A set of external functions is provided to calculate structural formulae from the standardized analyses as well as to estimate pressure–temperature conditions of crystallization, using empirical and semi-empirical thermobarometers from the literature. Two graphical user interface modules, Chem2D and Triplot3D, are used to plot mineral compositions in binary and ternary diagrams. As an example, the software is used to study a high-pressure Himalayan eclogite sample from the Stak massif in Pakistan. The high-pressure paragenesis consisting of omphacite and garnet has been retrogressed to a symplectitic assemblage of amphibole, plagioclase and clinopyroxene. Mineral compositions corresponding to ~165,000 analyses yield estimates for the eclogitic pressure–temperature retrograde path from 25 kbar to 9 kbar. The corresponding pressure–temperature maps were plotted and used to interpret the link between the equilibrium conditions of crystallization and the symplectitic microstructures. This example illustrates the usefulness of XMapTools for studying variations in the chemical composition of minerals and for retrieving information on metamorphic conditions at the microscale, towards the computation of continuous pressure–temperature–relative time paths in zoned metamorphic minerals not affected by post-crystallization diffusion.
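XMapTools itself is a MATLAB program; as a rough, hedged illustration of the standardization idea it describes (scaling a raw X-ray count map to oxide wt% using a few internal-standard spot analyses), the Python sketch below uses entirely invented numbers and is not the program's algorithm.

```python
# Schematic standardisation: fit a linear calibration from raw counts to wt% at
# spot analyses, then apply it to the whole count map. All values are invented.
import numpy as np

raw_counts_si = np.random.normal(1200, 30, size=(100, 100))  # toy SiO2 count map

# Invented calibration pairs: raw counts vs. measured SiO2 wt% at spot analyses
spot_counts = np.array([1100.0, 1220.0, 1350.0])
spot_wt_pct = np.array([36.5, 40.2, 44.8])

slope, intercept = np.polyfit(spot_counts, spot_wt_pct, 1)   # counts -> wt%
sio2_map = slope * raw_counts_si + intercept

print(f"calibration: wt% = {slope:.4f} * counts + {intercept:.2f}")
print(f"mean SiO2 in map: {sio2_map.mean():.1f} wt%")
```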
Abstract:
Bargaining is the building block of many economic interactions, ranging from bilateral to multilateral encounters and from situations in which the actors are individuals to negotiations between firms or countries. In all these settings, economists have long been intrigued by the fact that some projects, trades or agreements are not realized even though they are mutually beneficial. On the one hand, this has been explained by incomplete information. A firm may not be willing to offer a wage that is acceptable to a qualified worker, because it knows that there are also unqualified workers and cannot distinguish between the two types. This phenomenon is known as adverse selection. On the other hand, it has been argued that even with complete information, the presence of externalities may impede efficient outcomes. To see this, consider the example of climate change. If a subset of countries agrees to curb emissions, non-participant regions benefit from the signatories' efforts without incurring costs. These free-riding opportunities give rise to incentives to strategically improve one's bargaining power, which work against the formation of a global agreement. This thesis is concerned with extending our understanding of both factors, adverse selection and externalities. The findings are based on empirical evidence from original laboratory experiments as well as game theoretic modeling. On a very general note, it is demonstrated that the institutions through which agents interact matter to a large extent. Insights are provided about which institutions we should expect to perform better than others, at least in terms of aggregate welfare. Chapters 1 and 2 focus on the problem of adverse selection. Effective operation of markets and other institutions often depends on good information transmission properties. In terms of the example introduced above, a firm is only willing to offer high wages if it receives enough positive signals about the worker's quality during the application and wage bargaining process. In Chapter 1, it will be shown that repeated interaction coupled with time costs facilitates information transmission. By making the wage bargaining process costly for the worker, the firm is able to obtain more accurate information about the worker's type. The cost could be pure time cost from delaying agreement or the cost of effort arising from a multi-step interviewing process. In Chapter 2, I abstract from time costs and show that communication can play a similar role. The simple fact that a worker claims to be of high quality may be informative. In Chapter 3, the focus is on a different source of inefficiency. Agents strive for bargaining power and thus may be motivated by incentives that are at odds with the socially efficient outcome. I have already mentioned the example of climate change. Other examples are coalitions within committees that are formed to secure voting power to block outcomes, or groups that commit to different technological standards although a single standard would be optimal (e.g. the format war between HD DVD and Blu-ray). It will be shown that such inefficiencies are directly linked to the presence of externalities and a certain degree of irreversibility in actions. I now discuss the three articles in more detail. In Chapter 1, Olivier Bochet and I study a simple bilateral bargaining institution that eliminates trade failures arising from incomplete information. In this setting, a buyer makes offers to a seller in order to acquire a good.
Whenever an offer is rejected by the seller, the buyer may submit a further offer. Bargaining is costly, because both parties suffer a (small) time cost after any rejection. The difficulties arise because the good can be of low or high quality and the quality of the good is only known to the seller. Indeed, without the possibility of making repeated offers, it is too risky for the buyer to offer prices that allow for trade of high-quality goods. When allowing for repeated offers, however, at equilibrium both types of goods trade with probability one. We provide an experimental test of these predictions. Buyers gather information about sellers using specific price offers, and rates of trade are high, in line with the model's qualitative predictions. We also observe a persistent over-delay before trade occurs, and this reduces efficiency substantially. Possible channels for over-delay are identified in the form of two behavioral assumptions missing from the standard model, loss aversion (buyers) and haggling (sellers), which reconcile the data with the theoretical predictions. Chapter 2 also studies adverse selection, but interaction between buyers and sellers now takes place within a market rather than in isolated pairs. Remarkably, in a market it suffices to let agents communicate in a very simple manner to mitigate trade failures. The key insight is that better-informed agents (sellers) are willing to truthfully reveal their private information, because by doing so they are able to reduce search frictions and attract more buyers. Behavior observed in the experimental sessions closely follows the theoretical predictions. As a consequence, costless and non-binding communication (cheap talk) significantly raises rates of trade and welfare. Previous experiments have documented that cheap talk alleviates inefficiencies due to asymmetric information. These findings are explained by pro-social preferences and lie aversion. I use appropriate control treatments to show that such considerations play only a minor role in our market. Instead, the experiment highlights the ability to organize markets as a new channel through which communication can facilitate trade in the presence of private information. In Chapter 3, I theoretically explore coalition formation via multilateral bargaining under complete information. The environment studied is extremely rich in the sense that the model allows for all kinds of externalities. This is achieved by using so-called partition functions, which pin down a coalitional worth for each possible coalition in each possible coalition structure. It is found that although binding agreements can be written, efficiency is not guaranteed, because the negotiation process is inherently non-cooperative. The prospects of cooperation are shown to depend crucially on i) the degree to which players can renegotiate and gradually build up agreements and ii) the absence of a certain type of externalities that can loosely be described as incentives to free ride. Moreover, the willingness to concede bargaining power is identified as a novel reason for gradualism. Another key contribution of the study is that it identifies a strong connection between the Core, one of the most important concepts in cooperative game theory, and the set of environments for which efficiency is attained even without renegotiation.
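The notion of a partition function mentioned in Chapter 3 can be illustrated with a toy example in which the worth of a coalition depends on the whole coalition structure, so that an outsider benefits when others cooperate (a free-riding externality). All values below are invented and are not drawn from the thesis.

```python
# Toy three-player partition function: each coalition structure maps to the
# worth of every coalition it contains, so externalities on outsiders are visible.
singletons = (frozenset({1}), frozenset({2}), frozenset({3}))
partial    = (frozenset({1, 2}), frozenset({3}))
grand      = (frozenset({1, 2, 3}),)

partition_function = {
    singletons: {frozenset({1}): 1, frozenset({2}): 1, frozenset({3}): 1},
    partial:    {frozenset({1, 2}): 3, frozenset({3}): 2},   # outsider 3 gains
    grand:      {frozenset({1, 2, 3}): 6},
}

gain_outsider = (partition_function[partial][frozenset({3})]
                 - partition_function[singletons][frozenset({3})])
print(f"player 3 gains {gain_outsider} when 1 and 2 cooperate (free riding)")
```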
Abstract:
The floods that occurred on the Aare and Rhine rivers in May 2015 and the mostly successful handling of this event in terms of flood protection measures are a good reminder of how important it is to understand the causes and processes involved in such natural hazards. While the necessary data series of gauge measurements and peak discharge calculations reach back to the 19th century, historical records dating further back in time can provide additional and useful information to help understand extreme flood events and to evaluate prevention measures such as river dams and corrections undertaken prior to instrumental measurements. In my PhD project I will use a wide range of historical sources to assess and quantify past extreme flood events. It is part of the SNF-funded project “Reconstruction of the Genesis, Process and Impact of Major Pre-instrumental Flood Events of Major Swiss Rivers Including a Peak Discharge Quantification” and will cover the research locations Fribourg (Saane R.), Burgdorf (Emme R.), Thun, Bern (both Aare R.), and Lake Constance at the locations Lindau, Constance and Rorschach. My main goals are to provide a long time series of quantitative data for extreme flood events, to discuss the changes occurring in these data, and to evaluate the impact of the aforementioned human influences on the drainage system. Extracting information given in account books from the towns of Basel and Solothurn may also enable me to assess the frequency and seasonality of less severe river floods. Finally, historical information will be used to remodel the historical hydrological regime in order to homogenize the historical data series to modern-day conditions and thus make them comparable to the data provided by instrumental measurements. The method I will apply for processing all information provided by historical sources such as chronicles, newspapers and institutional records, as well as flood marks, paintings and archaeological evidence, has been developed and successfully applied to the site of Basel by Wetter et al. (2011). They have also shown that data homogenization is possible by reconstructing previous stream flow conditions using historical river profiles and by carefully observing and reconstructing human changes to the river bed and its surroundings. Taking all this information into account, peak discharges for past extreme flood events will be calculated with a one-dimensional hydrological model.
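As a hedged, first-order illustration of how a peak discharge can be estimated from a reconstructed cross-section (the project itself relies on a calibrated one-dimensional model, not this shortcut), the Manning formula is sketched below with invented values.

```python
# Manning formula: Q = (1/n) * A * R^(2/3) * S^(1/2), with hydraulic radius R = A / P.
# Cross-section area, wetted perimeter, slope and roughness are all invented.
def manning_discharge(area_m2, wetted_perimeter_m, slope, n_roughness):
    """First-order peak discharge estimate (m^3/s) for a reconstructed cross-section."""
    hydraulic_radius = area_m2 / wetted_perimeter_m
    return (1.0 / n_roughness) * area_m2 * hydraulic_radius ** (2 / 3) * slope ** 0.5

# Invented cross-section, e.g. reconstructed from a flood mark at a bridge
q_peak = manning_discharge(area_m2=420.0, wetted_perimeter_m=95.0,
                           slope=0.0012, n_roughness=0.035)
print(f"estimated peak discharge: {q_peak:.0f} m3/s")
```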
Abstract:
As librarians of the Social & Preventive Medicine Library in Bern, we help researchers perform systematic literature searches and teach students to use medical databases. We developed our skills mainly “on the job”, and we wondered how other health librarians in Europe were trained to become experts in searching. We had a great opportunity to “job shadow” specialists in this area of library service during a 5-day internship at the Royal Free Hospital Medical Library in London, Great Britain.
Abstract:
Interleukin 4 (IL-4) is expected to play a dominant role in the development of T helper (Th) 2 cells. Th2 immune responses, with expression of relatively large amounts of IL-4 but little interferon gamma (IFN-gamma), are characteristic of chronic helminth infections, but no information is available about IL-4 expression during early Fasciola hepatica (F. hepatica) infections in cattle. Therefore, we investigated F. hepatica-specific IL-4 and IFN-gamma mRNA expression in peripheral blood mononuclear cells (PBMCs) from calves experimentally infected with F. hepatica. Cells were collected prior to infection and on post-inoculation days (PIDs) 10, 28 and 70. Interestingly, PBMCs responded to stimulation with F. hepatica secretory-excretory products (FhSEP) as early as PID 10 and expressed high amounts of IL-4 but not of IFN-gamma mRNA, suggesting that F. hepatica induced a Th2-biased early immune response that was not restricted to the site of infection. Later in infection, IL-4 mRNA expression decreased whereas IFN-gamma mRNA expression increased slightly. Isolated lymph node cells (LNCs) stimulated with FhSEP and, even more importantly, non-stimulated LN tissue samples indicated highly polarized Th2-type immune responses in the draining (hepatic) lymph node, but not in the retropharyngeal lymph node. During preliminary experiments, two splice variants of bovine IL-4 mRNA, boIL-4delta2 and boIL-4delta3, were detected. Since human IL-4delta2 has been assumed to act as a competitive inhibitor of IL-4, it was important to know whether expression of these splice variants of bovine IL-4 has a regulatory function during an immune response to infection with F. hepatica. Indeed, IL-4 splice variants could be detected in a number of samples, but quantitative analysis did not yield any clue to their function. Therefore, the significance of the bovine IL-4 splice variants remains to be determined.