925 results for advantages of networking


Relevance:

90.00%

Abstract:

The rapid development of computed tomography (CT) and magnetic resonance imaging (MRI) has prompted the idea of using these techniques for postmortem documentation of forensic findings. Until now, only a few institutes of forensic medicine have acquired experience in postmortem cross-sectional imaging. Protocols, image interpretation and visualization have to be adapted to postmortem conditions. In particular, postmortem alterations such as putrefaction and livores, the different temperature of the corpse and the loss of circulation are a challenge for the imaging process and its interpretation. Advantages of postmortem imaging are the higher exposure and resolution available in CT when there is no concern for the biologic effects of ionizing radiation, and the lack of cardiac motion artifacts during scanning. CT and MRI may become useful tools for postmortem documentation in forensic medicine. In Bern, 80 human corpses underwent postmortem imaging by CT and MRI prior to traditional autopsy through August 2003. Here, we describe the imaging appearance of postmortem alterations (internal livores, putrefaction, postmortem clotting) and distinguish them from forensic findings of the heart, such as calcification, endocarditis, myocardial infarction, myocardial scarring, injury and other morphological alterations.

Relevance:

90.00%

Abstract:

Regime shifts, defined as a radical and persistent reconfiguration of an ecosystem following a disturbance, have been acknowledged by scientists as a very important aspect of ecosystem dynamics. However, their consideration in land management planning remains marginal and limited to specific processes and systems. Current research focuses on mathematical modeling and statistical analysis of spatio-temporal data for specific environmental variables. These methods do not fulfill the needs of land managers, who are confronted with a multitude of processes and pressure types and require clear and simple strategies to prevent regime shifts or to increase the resilience of their environment. The EU-FP7 CASCADE project is looking at regime shifts of dryland ecosystems in southern Europe and specifically focuses on rangeland and forest systems that are prone to various land degradation threats. One of the aims of the project is to evaluate the impact of different management practices on the dynamics of the environment in a participatory manner, including a multi-stakeholder evaluation of the state of the environment and of the management potential. To achieve this objective we have organized several stakeholder meetings and compiled a review of management practices using the WOCAT methodology, which enables merging scientific and land users' knowledge. We first highlight the main challenges we have encountered in applying the notion of regime shift to real-world socio-ecological systems and in translating related concepts such as tipping points, stable states, hysteresis and resilience for land managers, using concrete examples from CASCADE study sites. Secondly, we explore the advantages of including land users' knowledge in the scientific understanding of regime shifts. Finally, we discuss useful alternative concepts and lessons learnt that will allow us to build a participatory method for the assessment of resilient management practices in specific socio-ecological systems and to foster adaptive dryland management.

Relevance:

90.00%

Abstract:

Clinical observations made by practitioners and reported using web- and mobile-based technologies may benefit disease surveillance by improving the timeliness of outbreak detection. Equinella is a voluntary electronic reporting and information system established for the early detection of infectious equine diseases in Switzerland. Sentinel veterinary practitioners have been able to report cases of non-notifiable diseases and clinical symptoms to an internet-based platform since November 2013. Telephone interviews were carried out during the first year to understand the motivating and constraining factors affecting voluntary reporting and the use of mobile devices in a sentinel network. We found that non-monetary incentives attract sentinel practitioners; however, insufficient understanding of the reporting system and of its relevance, as well as concerns over the electronic dissemination of health data, were identified as potential challenges to sustainable reporting. Many practitioners are not yet aware of the advantages of mobile-based surveillance and may require some time to become accustomed to novel reporting methods. Finally, our study highlights the need for continued information feedback loops within voluntary sentinel networks.

Relevance:

90.00%

Abstract:

CONTEXT Radiolabelled choline positron emission tomography has changed the management of prostate cancer patients. However, new emerging radiopharmaceutical agents, such as radiolabelled prostate-specific membrane antigen (PSMA), and promising new hybrid imaging will pose new challenges in the diagnostic field. OBJECTIVE The continuous evolution of nuclear medicine has led to improved detection of recurrent prostate cancer (PCa), particularly distant metastases. New horizons have been opened for radiolabelled choline positron emission tomography (PET)/computed tomography (CT) as a guide for salvage therapy or for the assessment of systemic therapies. In addition, new tracers and imaging tools have recently been tested, providing important information for the management of PCa patients. Herein we discuss: (1) the available evidence in the literature on radiolabelled choline PET and its recent indications, (2) the role of alternative radiopharmaceutical agents, and (3) the advantages of a recent hybrid imaging device (PET/magnetic resonance imaging) in PCa. EVIDENCE ACQUISITION Data from recently published (2010-2015) original articles concerning the role of choline PET/CT, new emerging radiotracers, and a new imaging device are analysed. This review is reported according to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines. EVIDENCE SYNTHESIS In the restaging phase, the detection rate of choline PET varies between 4% and 97%, depending mainly on the site of recurrence and prostate-specific antigen (PSA) levels. Both 68-gallium (68Ga)-labelled PSMA and 18F-fluciclovine have been shown to be more accurate in the detection of recurrent disease than radiolabelled choline PET/CT. In particular, 68Ga-PSMA has detection rates of 50% and 68% for PSA levels < 0.5 ng/ml and 0.5-2 ng/ml, respectively. Moreover, 68Ga-PSMA PET/magnetic resonance imaging has demonstrated notably higher accuracy in detecting PCa than PET/CT. New tracers, such as radiolabelled bombesin or urokinase-type plasminogen activator receptor, are promising, but few clinical data are available to date. CONCLUSIONS Some limitations emerge from the published papers, both for radiolabelled choline PET/CT and for the new radiopharmaceutical agents. Efforts are still needed to enhance the impact of published data in the world of oncology, in particular when new radiopharmaceuticals are introduced into the clinical arena. PATIENT SUMMARY In the present review, the authors summarise the latest evidence for the assessment of prostate cancer in clinical practice using nuclear medicine modalities such as positron emission tomography/computed tomography and positron emission tomography/magnetic resonance imaging.

Relevance:

90.00%

Abstract:

Analysis of recurrent events has been widely discussed in medical, health services, insurance, and engineering areas in recent years. This research proposes to use a nonhomogeneous Yule process with the proportional intensity assumption to model the hazard function for recurrent-events data and the associated risk factors. This method assumes that repeated events occur for each individual, with given covariates, according to a nonhomogeneous Yule process with intensity function λ_x(t) = λ_0(t) · exp(x′β). One of the advantages of using a nonhomogeneous Yule process for recurrent events is that it assumes that the recurrence rate is proportional to the number of events that occur up to time t. Maximum likelihood estimation is used to provide estimates of the parameters in the model, and a generalized scoring iterative procedure is applied in the numerical computation. Model comparisons between the proposed method and other existing recurrent-event models are addressed by simulation. One example concerning recurrent myocardial infarction events, compared between two distinct populations (Mexican-Americans and non-Hispanic whites) in the Corpus Christi Heart Project, is examined.
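As a rough illustration of this kind of model, the sketch below simulates recurrent-event times from an intensity in which the proportional covariate effect exp(x′β) multiplies a nonhomogeneous baseline and the rate grows with the number of prior events. The specific conditional-intensity form, the baseline hazard λ_0(t) and all parameter values are assumptions made for illustration; this is not the study's estimation procedure or fitted model.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_yule_recurrent(x, beta, lam0, lam0_max, T):
    """Simulate one subject's recurrent-event times on [0, T] by thinning.

    Conditional intensity (assumed form for illustration):
        lambda(t | history) = (N(t-) + 1) * lam0(t) * exp(x' beta),
    i.e. the event rate is proportional to the number of prior events,
    with a nonhomogeneous baseline lam0(t) and covariate effect exp(x' beta).
    """
    rr = np.exp(np.dot(x, beta))             # relative risk exp(x' beta)
    t, n, times = 0.0, 0, []
    while True:
        bound = (n + 1) * lam0_max * rr      # upper bound used for thinning
        t += rng.exponential(1.0 / bound)    # candidate inter-arrival time
        if t > T:
            return np.array(times)
        # accept the candidate with probability lambda(t) / bound
        if rng.random() < (n + 1) * lam0(t) * rr / bound:
            times.append(t)
            n += 1

# Hypothetical baseline hazard and covariate values (not from the study data)
lam0 = lambda t: 0.05 + 0.01 * t             # slowly increasing baseline
T = 10.0
lam0_max = lam0(T)                           # bound for lam0 on [0, T]
beta = np.array([0.7])

events_exposed   = simulate_yule_recurrent(np.array([1.0]), beta, lam0, lam0_max, T)
events_reference = simulate_yule_recurrent(np.array([0.0]), beta, lam0, lam0_max, T)
print(len(events_exposed), len(events_reference))
```

Simulations of this kind are also how the abstract's model comparisons could be set up: generate event histories under a known intensity and check how well each candidate model recovers it.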

Relevance:

90.00%

Abstract:

Standard models of law enforcement involve the apprehension and punishment of a single suspect, but in many contexts, punishment is actually imposed on an entire group known to contain the offender. The advantages of "group punishment" are that the offender is punished with certainty and detection costs are saved. The disadvantage is that innocent individuals are punished. We compare individual and group punishment when social welfare depends on fairness, and when it depends on deterrence. We show that group punishment may dominate in the former case if the detection technology is ineffective, but never in the latter case. We discuss our results in the context of several examples.

Relevance:

90.00%

Abstract:

Institutional Review Boards (IRBs) are the primary gatekeepers for the protection of ethical standards of federally regulated research on human subjects in this country. This paper focuses on the general, broad measures that may be instituted or enhanced to exemplify a "model IRB". This is done by examining the current regulatory standards of federally regulated IRBs, not private or commercial boards, and how many of those standards have been found either inadequate or not generally understood or followed. The analysis includes suggestions on how to make the IRB process more efficient and less subject to litigation, and how to create standardized educational protocols for members. The paper also considers how to include better oversight for multi-center research, increased centralization of IRBs, utilization of Data Safety Monitoring Boards when necessary, payment for research protocol review, voluntary accreditation, and the institution of evaluation/quality assurance programs.

This is a policy study utilizing secondary analysis of publicly available data. The research therefore draws on scholarly medical/legal journals, web information from the Department of Health and Human Services, the Food and Drug Administration, and the Office of the Inspector General, accreditation programs, law review articles, and the current regulations applicable to the relevant portions of the paper.

Two issues are consistently cited in the literature as major concerns. The first is the need for basic, standardized educational requirements across all IRBs and their members; the second is much stricter and more informed management of continuing research. There is no federally regulated formal education system currently in place for IRB members, except for certain NIH-based trials. IRBs are also not keeping up with research once a study has begun, and although they are regulated to do so, this does not appear to be a high priority. This is the area most in danger of increased litigation. Other issues, such as voluntary accreditation and outcomes evaluation, are slowly gaining steam as the processes become more available and more sought after, as with JCAHO accreditation of hospitals.

Adopting the principles discussed in this paper should promote better use of a local IRB's time, money, and expertise for protecting the vulnerable population in its care. Without further improvements to the system, there is concern that private and commercial IRBs will attempt to create a monopoly on much of the clinical research in the future, as they are not as heavily regulated and can therefore offer companies quicker and more convenient reviews. IRBs need to consider the advantages of charging for their unique and important services as a cost of doing business. More importantly, there must be a minimum standard of education for all IRB members in the area of the ethical standards of human research, and a greater emphasis placed on the follow-up of ongoing research, as this is the most critical time for study participants and may soon become the largest area of litigation. Additionally, there should be a centralized IRB for multi-site trials, or a study website carrying important information affecting the trial in real time. Standards and metrics need to be developed to assess the performance of IRBs for quality assurance and outcome evaluation.
The boards should not be content to run the business of human subjects research without determining how well that function is actually being carried out. It is important that federally regulated IRBs provide excellence in human research and promote those values most important to the public at large.

Relevance:

90.00%

Abstract:

The Centers for Disease Control and Prevention estimates that foodborne diseases cause approximately 76 million illnesses, 325,000 hospitalizations, and 5,000 deaths in the United States each year. The American public is becoming more health conscious, and dietary intake of fresh fruits and vegetables has increased. Affluence and the demand for convenience have led consumers to opt for pre-processed, packaged fresh fruits and vegetables. These pre-processed foods are considered Ready-to-Eat: they have many of the advantages of fresh produce without the inconvenience of processing at home. After a decline in food-related illnesses between 1996 and 2004, driven by improvements in meat and poultry safety, tainted produce has tilted the numbers back, with the result that none of the Healthy People 2010 targets for reducing food-related illness has been reached. Irradiation has been shown to be effective in eliminating many foodborne pathogens, and its application as a food safety treatment has been widely endorsed by many of the major associations involved with food safety and public health. Despite these endorsements, there has been very little use of this technology to date for reducing the disease burden associated with the consumption of these products. A review of the literature published since the passage of the 1996 Food Quality Protection Act was conducted on the barriers to implementing irradiation as a food safety process for fresh fruits and vegetables. The impediments to widespread adoption of irradiation food processing as a food safety measure involve a complex array of legislative, regulatory, industry, and consumer issues. The FDA's approval process limits the expansion of the list of foods approved for the application of irradiation as a food safety process. There is also a lack of capacity within the industry to meet the needs of a geographically dispersed industry.

Relevance:

90.00%

Abstract:

Cross-sectional designs, longitudinal designs in which a single cohort is followed over time, and mixed-longitudinal designs in which several cohorts are followed for a shorter period are compared in terms of their precision, their potential for bias due to age, time and cohort effects, and their feasibility. Mixed-longitudinal studies have two advantages over longitudinal studies: the isolation of time and age effects, and a shorter completion time. Though the advantages of mixed-longitudinal studies are clear, choosing an optimal design is difficult, especially given the number of possible combinations of the number of cohorts and the number of overlapping intervals between cohorts. The purpose of this paper is to determine the optimal design for detecting differences in group growth rates.

The type of mixed-longitudinal study appropriate for modeling both individual and group growth rates is called a "multiple-longitudinal" design. A multiple-longitudinal study typically requires uniform or simultaneous entry of subjects, who are each observed until the end of the study.

While recommendations for designing pure-longitudinal studies have been made by Schlesselman (1973b), Lefant (1990) and Helms (1991), design recommendations for multiple-longitudinal studies have never been published. It is shown that an optimal multiple-longitudinal design can be determined by using power analyses to find the minimum number of occasions per cohort and the minimum number of overlapping occasions between cohorts, in conjunction with a cost model. An example of systolic blood pressure values for cohorts of males and cohorts of females, ages 8 to 18 years, is given.
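A rough sense of the power-analysis step can be obtained by simulation: generate repeated measurements for two groups with different mean growth rates, estimate a slope per subject, and see how often a two-sample test on the group slopes reaches significance as the number of measurement occasions grows. The variance parameters, group sizes and effect size below are hypothetical, and a real multiple-longitudinal design would also weigh the number of cohorts, the overlap between them, and the cost model described in the abstract.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def power_slope_difference(n_per_group, n_occasions, slope_diff,
                           resid_sd=5.0, slope_sd=0.8, n_sim=500, alpha=0.05):
    """Simulation-based power for detecting a group difference in growth rates.

    Each subject is measured at `n_occasions` annual visits; an individual
    slope is estimated by least squares, and the group mean slopes are compared
    with a two-sample t-test. All variance parameters are illustrative.
    """
    ages = np.arange(n_occasions, dtype=float)      # occasions 0, 1, ..., k-1
    hits = 0
    for _ in range(n_sim):
        group_slopes = []
        for mean_slope in (2.0, 2.0 + slope_diff):  # the two group growth rates
            subj_slopes = rng.normal(mean_slope, slope_sd, n_per_group)
            est = []
            for b in subj_slopes:
                y = 100 + b * ages + rng.normal(0, resid_sd, n_occasions)
                est.append(np.polyfit(ages, y, 1)[0])   # fitted individual slope
            group_slopes.append(est)
        p = stats.ttest_ind(group_slopes[0], group_slopes[1]).pvalue
        hits += p < alpha
    return hits / n_sim

# How many occasions per cohort are needed to detect a 0.5-unit/yr difference?
for k in (3, 4, 5, 6):
    print(k, power_slope_difference(n_per_group=40, n_occasions=k, slope_diff=0.5))
```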

Relevance:

90.00%

Abstract:

The role of clinical chemistry has traditionally been to evaluate acutely ill or hospitalized patients. Traditional statistical methods have serious drawbacks in that they rely on univariate techniques. To demonstrate an alternative methodology, a multivariate analysis of covariance model was developed and applied to the data from the Cooperative Study of Sickle Cell Disease (CSSCD).

The purpose of developing the model for the laboratory data from the CSSCD was to evaluate the comparability of results from the different clinics. Several variables were incorporated into the model in order to control for possible differences among the clinics that might confound any real laboratory differences.

Differences for LDH, alkaline phosphatase, and SGOT were identified that will necessitate adjustment by clinic whenever these data are used. In addition, aberrant clinic values for LDH, creatinine, and BUN were identified.

The use of any statistical technique, including multivariate analysis, without thoughtful consideration may lead to spurious conclusions that may not be corrected for some time, if ever. However, the advantages of multivariate analysis far outweigh its potential problems. If its use increases as it should, its applicability to the analysis of laboratory data in prospective patient monitoring, quality control programs, and the interpretation of data from cooperative studies could well have a major impact on the health and well-being of a large number of individuals.
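For readers unfamiliar with the technique, the sketch below shows what a multivariate analysis of covariance of multi-clinic laboratory data can look like: several analytes are tested jointly for a clinic effect while adjusting for covariates. The data, column names and covariates are simulated stand-ins, not the CSSCD data or the study's actual model.

```python
import numpy as np
import pandas as pd
from statsmodels.multivariate.manova import MANOVA

# Hypothetical laboratory dataset: several analytes measured at multiple clinics,
# with covariates that could otherwise confound clinic-to-clinic comparisons.
rng = np.random.default_rng(1)
n = 300
df = pd.DataFrame({
    "clinic": rng.choice(["A", "B", "C"], n),
    "age":    rng.uniform(5, 45, n),
    "sex":    rng.choice(["F", "M"], n),
})
base = 0.5 * df["age"].to_numpy()
df["ldh"]      = 200 + base + rng.normal(0, 30, n)
df["alk_phos"] = 90  + base + rng.normal(0, 20, n)
df["sgot"]     = 25  + 0.2 * base + rng.normal(0, 8, n)

# Multivariate analysis of covariance: test the clinic effect on the three
# analytes jointly, adjusting for age and sex.
fit = MANOVA.from_formula("ldh + alk_phos + sgot ~ clinic + age + sex", data=df)
print(fit.mv_test())
```

Testing the analytes jointly, rather than one at a time, is what distinguishes this approach from the univariate methods criticised in the abstract.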

Relevance:

90.00%

Abstract:

Background: A key physical characteristic of protons is that they deliver most of their radiation dose to the target volume and no dose to the normal tissue distal to the tumor. Numerous previous studies have shown unique advantages of proton therapy over intensity-modulated radiation therapy (IMRT) in conforming dose to the tumor and sparing dose to the surrounding normal tissues and critical structures in many clinical sites. However, proton therapy is known to be more sensitive to treatment uncertainties such as inter- and intra-fractional variations in patient anatomy. To date, no study has clearly demonstrated the effectiveness of proton therapy compared with conventional IMRT when both respiratory motion and tumor shrinkage are taken into account in non-small cell lung cancer (NSCLC) patients. Purpose: This thesis investigated two questions to establish a clinically relevant comparison of the two modalities (IMRT and proton therapy). The first question was whether there are any differences in tumor shrinkage between patients randomized to IMRT versus passively scattered proton therapy (PSPT). Tumor shrinkage is considered a standard measure of radiation therapy response and has been widely used to gauge short-term treatment response. The second question was whether there are any differences between the planned dose and the 5D dose under the influence of inter- and intra-fractional variations in patient anatomy for both modalities. Methods: A total of 45 patients (25 IMRT patients and 20 PSPT patients) were used to quantify tumor shrinkage in terms of the change in the primary gross tumor volume (GTVp). All patients were randomized to receive either IMRT or PSPT for NSCLC, and treatment planning goals were identical for both groups. All patients received 5 to 8 weekly 4-dimensional computed tomography (4DCT) scans during the course of radiation treatment. The original GTVp contours were propagated to T50 of the weekly 4DCT images using deformable image registration, and their absolute volumes were measured. Statistical analysis was performed to compare the distribution of tumor shrinkage between the two groups. To investigate the difference between the planned dose and the 5D dose with consideration of both breathing motion and anatomical change, we re-calculated new dose distributions at every phase of the breathing cycle for all available weekly 4DCT data sets, which resulted in 50 to 80 individual dose calculations for each of the 7 patients presented in this thesis. The newly calculated dose distributions were then deformed and accumulated to T50 of the planning 4DCT for comparison with the planned dose distribution. Results: At the end of treatment, the IMRT and PSPT groups showed mean tumor volume reductions of 23.6% (±19.2%) and 20.9% (±17.0%), respectively. The mean difference in tumor shrinkage between the two groups was 3%, with a corresponding 95% confidence interval of [-8%, 14%]. The rate of tumor shrinkage was highly correlated with the initial tumor volume. For the planned dose and 5D dose comparison, all 7 patients showed a mean difference of 1% in target coverage for both the IMRT and PSPT treatment plans.
Conclusions: The tumor shrinkage investigation showed no statistically significant difference between the IMRT and PSPT patients, and tumor shrinkage in the two modalities is similar based on the 95% confidence interval. The pilot study comparing the planned dose with the 5D dose found a difference of only 1%. Overall, the two modalities, compared under the same protocol (i.e., a randomized trial), showed similar results in terms of treatment response as measured by tumor shrinkage and in terms of 5D dose under the influence of anatomical change.
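As a sanity check, the reported interval can be reproduced approximately from the summary statistics in the abstract. The sketch below assumes the parenthesised values are standard deviations and uses a Welch-type t approximation; the thesis may have used a different exact method.

```python
import math
from scipy import stats

# Summary statistics reported in the abstract: mean tumor volume reduction
# (the parenthesised values are assumed to be standard deviations) and the
# randomized group sizes (25 IMRT, 20 PSPT).
mean_imrt, sd_imrt, n_imrt = 23.6, 19.2, 25
mean_pspt, sd_pspt, n_pspt = 20.9, 17.0, 20

diff = mean_imrt - mean_pspt                                 # roughly 3%
v1, v2 = sd_imrt**2 / n_imrt, sd_pspt**2 / n_pspt
se = math.sqrt(v1 + v2)                                      # Welch standard error
df = (v1 + v2)**2 / (v1**2 / (n_imrt - 1) + v2**2 / (n_pspt - 1))
t_crit = stats.t.ppf(0.975, df)

lo, hi = diff - t_crit * se, diff + t_crit * se
print(f"difference: {diff:.1f}%, 95% CI: [{lo:.0f}%, {hi:.0f}%]")
# -> approximately [-8%, 14%], matching the interval reported in the abstract
```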

Relevance:

90.00%

Abstract:

This paper empirically examines the different comparative advantages of two emerging economic giants, China and India, in relation to the different skill distribution patterns in each country. By utilizing industry export data on China and India from 1983 to 2000, we find that a country with a greater dispersion of skills (i.e., India, especially in the earlier years) has higher exports in industries with shorter production chains, whereas a country with a more equal dispersion of skills (i.e., China, especially in the later years) is found to have higher exports in industries with longer production chains. The causal relationship is fairly robust across different specifications. This empirical evidence supports our assumption that the likely mechanism for these results is the negative impact of low-skilled workers on input quality, which accumulates and becomes larger as the length of production chains and the proportion of low-skilled workers in the economy increase.

Relevance:

90.00%

Abstract:

Once the advantages of object-based classification over pixel-based classification are acknowledged, the need arises for simple and affordable methods to define and characterize the objects to be classified. This paper presents a new methodology for the identification and characterization of objects at different scales through the integration of the spectral information provided by a multispectral image and the textural information from the corresponding panchromatic image. In this way, a set of objects is defined that yields a simplified representation of the information contained in the two source images. These objects can be characterized by different attributes that allow discrimination between different spectral and textural patterns. This methodology facilitates information processing from both a conceptual and a computational point of view, and the resulting attribute vectors can be used directly as training input for certain classifiers, for example artificial neural networks. Growing Cell Structures have been used to classify the merged information.
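A minimal sketch of this kind of pipeline is given below: each segmented object is summarized by a feature vector combining per-band spectral means with simple texture statistics from the panchromatic image, and the vectors are fed to a neural classifier. The imagery, segmentation and class labels are synthetic placeholders, and a standard multilayer perceptron stands in for the Growing Cell Structures network used in the paper.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(7)

def object_features(ms_image, pan_image, object_mask):
    """Feature vector for one object: per-band spectral means plus simple
    texture statistics (std, range) of the panchromatic pixels it covers."""
    ms_pixels = ms_image[object_mask]            # (n_pixels, n_bands)
    pan_pixels = pan_image[object_mask]          # (n_pixels,)
    spectral = ms_pixels.mean(axis=0)
    texture = np.array([pan_pixels.std(), pan_pixels.max() - pan_pixels.min()])
    return np.concatenate([spectral, texture])

# Synthetic stand-in imagery: 4-band multispectral + panchromatic image, with a
# label map of segmented objects (all values are purely illustrative).
h, w, bands = 64, 64, 4
ms = rng.normal(size=(h, w, bands))
pan = rng.normal(size=(h, w))
labels_map = rng.integers(0, 50, size=(h, w))    # 50 "objects" from a segmentation

X, y = [], []
for obj_id in np.unique(labels_map):
    mask = labels_map == obj_id
    X.append(object_features(ms, pan, mask))
    y.append(obj_id % 3)                          # 3 hypothetical land-cover classes
X, y = np.array(X), np.array(y)

# An MLP is used as a readily available stand-in for Growing Cell Structures.
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
clf.fit(X, y)
print(clf.predict(X[:5]))
```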

Relevance:

90.00%

Abstract:

Environmental constraints imposed on hydropower operation are usually given in the form of minimum environmental flows and maximum and minimum rates of change of flows, or ramp rates. One solution proposed to mitigate the environmental impact caused by the flows discharged by a hydropower plant, while reducing the economic impact of the above-mentioned constraints, consists of building a re-regulation reservoir, or afterbay, downstream of the power plant. Adding pumping capability between the re-regulation reservoir and the main one could contribute both to reducing the size of the re-regulation reservoir, with the consequent environmental improvement, and to improving the economic feasibility of the project, while always fulfilling the environmental constraints imposed on hydropower operation. The objective of this paper is to study the contribution of a re-regulation reservoir to fulfilling the environmental constraints while reducing their economic impact. For that purpose, a revenue-driven optimization model based on mixed integer linear programming is used. Additionally, the advantages of adding pumping capability are analysed. In order to illustrate the applicability of the methodology, a case study based on a real hydropower plant is presented.
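To make the modelling idea concrete, here is a small revenue-driven mixed-integer linear program in the same spirit, written with the PuLP library: a main reservoir feeds a re-regulation reservoir through a turbine/pump pair, and the release to the river must respect a minimum environmental flow and ramp-rate limits. All prices, inflows, capacities and efficiencies are invented for illustration, and the formulation is deliberately simplified relative to the paper's model.

```python
import pulp

# Hypothetical hourly data for one day; flows and volumes are expressed in
# consistent per-hour units so the mass balances stay linear. All values are
# illustrative, not data from the case study.
T = 24
price = [30, 28, 27, 26, 27, 32, 45, 60, 70, 65, 55, 50,
         48, 47, 50, 55, 65, 75, 80, 70, 55, 45, 38, 32]   # EUR/MWh
inflow = [0.6] * T                 # natural inflow to the main reservoir
Q_ENV_MIN, RAMP_MAX = 0.4, 0.3     # min environmental flow, max hourly change
QT_MAX, QP_MAX = 2.0, 1.5          # turbine / pump flow limits
E_TURB, E_PUMP = 0.9, 1.1          # MWh generated / consumed per unit of flow
VMAIN_MAX, VRR_MAX = 50.0, 6.0     # main reservoir and afterbay capacities
VMAIN0, VRR0 = 25.0, 3.0           # initial storages

m = pulp.LpProblem("reregulation_revenue", pulp.LpMaximize)
qt = pulp.LpVariable.dicts("q_turb", range(T), 0, QT_MAX)
qp = pulp.LpVariable.dicts("q_pump", range(T), 0, QP_MAX)
qo = pulp.LpVariable.dicts("q_out",  range(T), Q_ENV_MIN)   # release to the river
vm = pulp.LpVariable.dicts("v_main", range(T), 0, VMAIN_MAX)
vr = pulp.LpVariable.dicts("v_rr",   range(T), 0, VRR_MAX)
z  = pulp.LpVariable.dicts("turbining", range(T), cat="Binary")

# Revenue from generation minus the cost of pumping, both at the hourly price.
m += pulp.lpSum(price[t] * (E_TURB * qt[t] - E_PUMP * qp[t]) for t in range(T))

for t in range(T):
    vm_prev = VMAIN0 if t == 0 else vm[t - 1]
    vr_prev = VRR0 if t == 0 else vr[t - 1]
    # Mass balances for the main reservoir and the re-regulation reservoir.
    m += vm[t] == vm_prev + inflow[t] - qt[t] + qp[t]
    m += vr[t] == vr_prev + qt[t] - qp[t] - qo[t]
    # No simultaneous turbining and pumping.
    m += qt[t] <= QT_MAX * z[t]
    m += qp[t] <= QP_MAX * (1 - z[t])
    # Ramp-rate limits on the flow actually released to the river.
    qo_prev = Q_ENV_MIN if t == 0 else qo[t - 1]
    m += qo[t] - qo_prev <= RAMP_MAX
    m += qo_prev - qo[t] <= RAMP_MAX

m.solve(pulp.PULP_CBC_CMD(msg=False))
print("status:", pulp.LpStatus[m.status], " revenue:", pulp.value(m.objective))
```

Because the environmental constraints are placed on the afterbay release rather than on the turbine discharge itself, the optimizer is free to concentrate generation in the high-price hours, which is precisely the economic role of the re-regulation reservoir discussed in the abstract.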

Relevance:

90.00%

Abstract:

Advances in solid-state lighting have overcome common limitations of optical wireless, such as the power requirements caused by light dispersion. It has recently been proposed to modify lamp drivers to take advantage of their switching behaviour in order to add data links while maintaining the illumination control they provide. In this paper, a remote access application using visible light communications is presented that provides wireless access to a remote computer using a touchscreen as the user interface.
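One common way to carry data over an LED driver without changing the perceived brightness is Manchester-coded on-off keying, since every bit spends half its time with the LED on. The sketch below is only a generic illustration of that idea; the abstract does not specify the modulation or framing actually used in the paper, and the payload format shown is hypothetical.

```python
from typing import List

def manchester_encode(data: bytes) -> List[int]:
    """Manchester-encode a byte stream for LED on-off keying.

    Each bit becomes a two-chip symbol (1 -> [1, 0], 0 -> [0, 1]), so the LED
    duty cycle stays at 50% regardless of the payload and the perceived
    brightness of the lamp is unaffected by the data being sent.
    """
    chips = []
    for byte in data:
        for i in range(7, -1, -1):               # MSB first
            bit = (byte >> i) & 1
            chips.extend([1, 0] if bit else [0, 1])
    return chips

def manchester_decode(chips: List[int]) -> bytes:
    """Recover the byte stream from the chip sequence (assumes chip alignment)."""
    bits = []
    for first, second in zip(chips[0::2], chips[1::2]):
        bits.append(1 if (first, second) == (1, 0) else 0)
    out = bytearray()
    for i in range(0, len(bits) - 7, 8):
        byte = 0
        for b in bits[i:i + 8]:
            byte = (byte << 1) | b
        out.append(byte)
    return bytes(out)

payload = b"touch:1024,768"                      # hypothetical touchscreen event
chips = manchester_encode(payload)
assert sum(chips) * 2 == len(chips)              # exactly 50% of chips are "on"
assert manchester_decode(chips) == payload
print(f"{len(payload)} bytes -> {len(chips)} chips, duty cycle 50%")
```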