878 results for Parametric devices
Abstract:
OBJECTIVE: To determine the technical procedures and criteria used by Brazilian physicians for measuring blood pressure and diagnosing hypertension. METHODS: A questionnaire with 5 questions about practices and behaviors regarding blood pressure measurement and the diagnosis of hypertension was sent to 25,606 physicians in all Brazilian regions through a mailing list. The responses were compared with the recommendations of a specific consensus statement and analyzed descriptively. RESULTS: Of the 3,621 (14.1%) responses obtained, 57% were from the southeastern region of Brazil. The following items were reported: use of an aneroid device by 67.8%; use of a mercury column device by 14.6%; 11.9% of the participants never calibrated the devices; 35.7% calibrated the devices at intervals < 1 year; 85.8% measured blood pressure in 100% of the medical visits; 86.9% measured blood pressure more than once and on more than one occasion. For the diagnosis of hypertension, 55.7% considered the patient's age, and only one third relied on consensus statements. CONCLUSION: Although both practices were reported at an adequate frequency, adherence fell far short of what was expected, and some contradictions were found between the diagnostic criterion for hypertension and the number of blood pressure measurements. The results suggest that, to reach the great majority of medical professionals, dissemination of consensus statements and blood pressure measurement techniques should go beyond the boundaries of medical events and specialized journals.
Abstract:
The adoption of a sustainable approach to meeting the energy needs of society has recently taken on a more central and urgent place in the minds of many people. There are many reasons for this, including ecological, environmental and economic concerns. One particular area where a sustainable approach has become very relevant is the production of electricity. The contribution of renewable sources to the energy mix supplying the electricity grid is nothing new, but the focus has begun to move away from the more conventional renewable sources such as wind and hydro. Exploring new and innovative sources of renewable energy is now seen as imperative as the older forms (i.e. hydro) reach the saturation point of their possible exploitation. One such innovative source currently beginning to be utilised in this regard is tidal energy. The purpose of this thesis is to isolate one specific drawback of tidal energy that could be considered a roadblock to this energy source becoming a major contributor to the Irish national grid: the inconsistent way in which a tidal device generates energy over the course of a 24-hour period. This inconsistency of supply can result in the cycling of conventional power plants in order to even out the supply, subsequently leading to additional costs. The thesis includes a review of literature relevant to tidal and other marine energy sources, with an emphasis on the state-of-the-art devices currently in development or production. The research carried out included analysing tidal data and incorporating it into a model of the power-generating potential at specific sites. A solution to the drawback of inconsistency of supply is then proposed, which involves positioning various tidal generation installations at specifically selected locations around the Irish coast. The temporal shift achieved in the power supply profiles of the individual sites by siting the installations appropriately produced an overall power supply profile with a smoother curve and a consistent base-load energy supply. Some limitations of the method employed are also outlined, and suggestions for further improvements are made.
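The smoothing effect described in this abstract can be illustrated with a minimal numerical sketch (not the thesis model): each site's output is represented by an idealised semidiurnal power profile, and phase lags between hypothetical sites shift their peaks so that the aggregate profile flattens. The site names, phase lags and rated power below are invented for illustration only.

```python
import numpy as np

# Idealised semidiurnal tidal power profile: P(t) ~ rated * sin^2 of the tidal cycle.
# The ~12.42 h period and the per-site phase lags are illustrative assumptions,
# not data or results from the thesis.
PERIOD_H = 12.42

def site_profile(t_hours, rated_mw, phase_lag_h):
    """Idealised power output (MW) of one tidal installation over time."""
    phase = 2 * np.pi * (t_hours - phase_lag_h) / PERIOD_H
    return rated_mw * np.sin(phase) ** 2

t = np.linspace(0, 24, 24 * 60)  # one day at minute resolution
# Hypothetical installations with staggered tidal phases around the coast;
# with these particular offsets the combined output is almost flat.
sites = {"Site A": 0.0, "Site B": PERIOD_H / 6, "Site C": PERIOD_H / 3}

total = sum(site_profile(t, rated_mw=30.0, phase_lag_h=lag) for lag in sites.values())

single = site_profile(t, 30.0, 0.0)
print(f"Single site: min {single.min():.1f} MW, max {single.max():.1f} MW")
print(f"Three phased sites: min {total.min():.1f} MW, max {total.max():.1f} MW")
```

A single site swings between zero and its rated output, while the three phase-shifted sites together deliver a near-constant base load, which is the effect the thesis exploits by selecting installation locations.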
Abstract:
Magdeburg, Univ., Faculty of Mathematics, Diss., 2015
Abstract:
Magdeburg, Univ., Faculty of Computer Science, Diss., 2015
Abstract:
This comment corrects the errors in the estimation process that appear in Martins (2001). The first error is in the parametric probit estimation, as the previously presented results do not maximize the log-likelihood function; at the global maximum, more variables become significant. As for the semiparametric estimation method, the kernel function used in Martins (2001) can take on both positive and negative values, which implies that the participation probability estimates may fall outside the interval [0,1]. We solve the problem by applying local smoothing in the kernel estimation, as suggested by Klein and Spady (1993).
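As a hedged illustration of the boundedness point made above, the sketch below estimates a participation probability with a simple Nadaraya-Watson estimator and a Gaussian (everywhere non-negative) kernel on simulated single-index data. It is not the Klein-Spady estimator or the correction applied in the comment; it only shows why a non-negative kernel keeps the probability estimates inside [0,1].

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated single-index participation data: y = 1{x*beta + e > 0}.
n = 500
x = rng.normal(size=n)
beta = 1.0
y = (beta * x + rng.normal(size=n) > 0).astype(float)

def nw_probability(index_grid, index_obs, y_obs, bandwidth):
    """Nadaraya-Watson estimate of P(y = 1 | index). With a non-negative kernel
    the estimate is a weighted average of 0/1 outcomes, hence stays in [0,1]."""
    u = (index_grid[:, None] - index_obs[None, :]) / bandwidth
    w = np.exp(-0.5 * u**2)  # Gaussian kernel: strictly positive weights
    return (w * y_obs).sum(axis=1) / w.sum(axis=1)

grid = np.linspace(-3, 3, 61)
p_hat = nw_probability(grid, beta * x, y, bandwidth=0.4)
assert (p_hat >= 0).all() and (p_hat <= 1).all()
print(p_hat[::10].round(3))
```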
Abstract:
Study carried out during a stay at Imperial College London between July and November 2006. This work investigates the most appropriate specimen geometry for characterizing the intralaminar fracture toughness of woven-fabric laminated composite materials. The objective is to ensure crack propagation without the specimen failing first by any other damage mechanism, so that the intralaminar fracture toughness of woven laminated composites can be characterized experimentally. To this end, a parametric analysis of different specimen types was carried out using the finite element (FE) method combined with the virtual crack closure technique (VCCT). The specimen geometries analysed correspond to the compact tension (CT) test specimen and several variations, such as the extended compact tension (ECT), widened compact tension (WCT), tapered compact tension (TCT) and doubly-tapered compact tension (2TCT) specimens. From these analyses, several conclusions were drawn on the most suitable specimen geometry for characterizing the intralaminar fracture toughness of woven laminated composites. In addition, a series of experimental tests was carried out to validate the results of the parametric analyses. Good agreement was found between the numerical and experimental results, despite the presence of effects not anticipated during the experimental tests.
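For readers unfamiliar with the VCCT evaluation mentioned above, the following is a minimal sketch of the standard one-step mode I energy release rate formula for a 2-D FE crack-tip element, G_I = F·Δv / (2·Δa·t). The numerical values are invented placeholders, not results from this study; in practice the force and opening displacement come from the FE solution.

```python
# Minimal sketch of the one-step virtual crack closure technique (VCCT) for mode I
# in a 2-D finite element model. All numbers below are hypothetical placeholders.

def vcct_mode_i(f_tip_n, delta_v_m, elem_length_m, thickness_m):
    """Mode I energy release rate G_I = F * dv / (2 * da * t), in J/m^2."""
    return f_tip_n * delta_v_m / (2.0 * elem_length_m * thickness_m)

g_i = vcct_mode_i(f_tip_n=120.0,         # nodal force at the crack tip (N)
                  delta_v_m=4.0e-6,      # relative opening displacement behind the tip (m)
                  elem_length_m=0.5e-3,  # crack-tip element length (m)
                  thickness_m=2.0e-3)    # specimen thickness (m)
print(f"G_I = {g_i:.1f} J/m^2")  # compared against the material toughness G_Ic
```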
Abstract:
Lean meat percentage (LMP) is the criterion for carcass classification and it must be measured objectively on line. The aim of this work was to compare the root mean square error of prediction (RMSEP) of the LMP measured with the following devices: Fat-O-Meat’er (FOM), UltraFOM (UFOM), AUTOFOM and VCS2000. To this end, the same 99 carcasses were measured with all four devices and dissected according to the European Reference Method. Moreover, a subsample of the carcasses (n=77) was fully scanned with X-ray computed tomography (CT) equipment. The RMSEP calculated with leave-one-out cross-validation was lower for FOM and AUTOFOM (1.8% and 1.9%, respectively) and higher for UFOM and VCS2000 (2.3% for both devices). The error obtained with CT was the lowest (0.96%), in accordance with previous results, but CT cannot be used on line. It can be concluded that FOM and AUTOFOM presented better accuracy than UFOM and VCS2000.
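The RMSEP with leave-one-out cross-validation used above can be computed as in the sketch below, here for a simple linear prediction of LMP from device measurements. The predictor columns and response are simulated stand-ins, not the trial's carcass data, and the linear model is only an assumed example of a prediction equation.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated stand-in data: device measurements (columns of X, with an intercept)
# and dissected lean meat percentage (y). Not the real carcass measurements.
n = 99
X = np.column_stack([np.ones(n), rng.normal(15, 3, n), rng.normal(55, 6, n)])
y = 60 - 0.8 * X[:, 1] + 0.3 * X[:, 2] + rng.normal(0, 1.8, n)

def rmsep_loo(X, y):
    """Root mean square error of prediction under leave-one-out cross-validation:
    refit the prediction equation without carcass i, predict carcass i, repeat."""
    errors = []
    for i in range(len(y)):
        mask = np.arange(len(y)) != i
        coef, *_ = np.linalg.lstsq(X[mask], y[mask], rcond=None)
        errors.append(y[i] - X[i] @ coef)
    return np.sqrt(np.mean(np.square(errors)))

print(f"RMSEP (leave-one-out): {rmsep_loo(X, y):.2f}% LMP")
```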
Abstract:
We present a real data set of claim amounts where costs related to damage are recorded separately from those related to medical expenses. Only claims with positive costs are considered here. Two approaches to density estimation are presented: a classical parametric method and a semi-parametric method based on transformation kernel density estimation. We explore the data set with standard univariate methods. We also propose ways to select the bandwidth and transformation parameters in the univariate case based on Bayesian methods. We indicate how to compare the results of alternative methods, both by looking at the shape of the density over its whole domain and by exploring the density estimates in the right tail.
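A minimal sketch of the transformation idea follows: estimate a kernel density on the log scale, where heavy-tailed positive claim costs are closer to symmetric, and map it back with the Jacobian. This is a simplified stand-in, with simulated claims and Silverman's rule of thumb instead of the Bayesian bandwidth and transformation-parameter selection discussed in the paper.

```python
import numpy as np

rng = np.random.default_rng(2)
claims = rng.lognormal(mean=7.0, sigma=1.2, size=1000)  # simulated positive claim costs

def transformation_kde(costs, grid):
    """Kernel density estimate of positive, heavy-tailed data via a log transform:
    estimate the density of z = log(x) with a Gaussian kernel, then back-transform
    with the Jacobian, f_X(x) = f_Z(log x) / x. Bandwidth: Silverman's rule on the
    log scale (a stand-in for the Bayesian selection in the paper)."""
    z = np.log(costs)
    h = 1.06 * z.std(ddof=1) * len(z) ** (-1 / 5)
    u = (np.log(grid)[:, None] - z[None, :]) / h
    f_z = np.exp(-0.5 * u**2).sum(axis=1) / (len(z) * h * np.sqrt(2 * np.pi))
    return f_z / grid

grid = np.linspace(claims.min(), np.quantile(claims, 0.99), 200)
density = transformation_kde(claims, grid)
mass = np.sum(0.5 * (density[1:] + density[:-1]) * np.diff(grid))
print(f"Integrated mass up to the 99th percentile: {mass:.3f}")
```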
Abstract:
Background: Temporary percutaneous left ventricular assist devices (TPLVAD) can be inserted and removed in awake patients. They substitute left ventricular function for a period of up to a few weeks and provide an excellent backup and bridge to recovery or decision. Methods: Retrospective analysis of 75 patients who received a TPLVAD to treat cardiogenic shock (n = 49) or to facilitate high-risk percutaneous coronary intervention (PCI) (n = 26). Forty-two patients with cardiogenic shock and 16 patients with high-risk PCI received a TandemHeart, and 7 and 10 patients, respectively, received an Impella Recover LP 2.5. Outcome and related complications up to 1 month are reported according to the device used. Results: One-month survival was 53% in patients with shock and 96% in patients with PCI. Conclusion: TPLVADs can support the failing heart with acceptable risk. Outcome is better in prophylactic use than in patients with cardiogenic shock. (C) 2011 Wiley-Liss, Inc.
Abstract:
This paper presents an analysis of motor vehicle insurance claims relating to vehicle damage and to associated medical expenses. We use univariate severity distributions estimated with parametric and non-parametric methods. The methods are implemented using the statistical package R. Parametric analysis is limited to the estimation of normal and lognormal distributions for each of the two claim types. The non-parametric analysis presented involves kernel density estimation. We illustrate the benefits of applying transformations to data prior to employing kernel-based methods, using a log-transformation and an optimal transformation among a class of transformations that produces symmetry in the data. The central aim of this paper is to provide educators with material that can be used in the classroom to teach statistical estimation methods, goodness-of-fit analysis and, importantly, statistical computing in the context of insurance and risk management. To this end, we have included in the Appendix of this paper all the R code that has been used in the analysis, so that readers, both students and educators, can fully explore the techniques described.
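The parametric side of this comparison can be sketched as below: maximum likelihood fits of the normal and lognormal models and a log-likelihood comparison. The paper's own material is in R; this is a hedged Python stand-in on simulated claim severities, not the Appendix code or the paper's data.

```python
import numpy as np

rng = np.random.default_rng(3)
damage = rng.lognormal(mean=6.5, sigma=1.0, size=500)  # simulated claim severities

# Maximum likelihood estimates of the two parametric models mentioned above:
# normal uses the sample mean/std of the claims, lognormal the mean/std of the logs.
mu_n, sd_n = damage.mean(), damage.std(ddof=0)
mu_ln, sd_ln = np.log(damage).mean(), np.log(damage).std(ddof=0)

def loglik_normal(x, mu, sd):
    return np.sum(-0.5 * np.log(2 * np.pi * sd**2) - (x - mu) ** 2 / (2 * sd**2))

def loglik_lognormal(x, mu, sd):
    # lognormal log-likelihood = normal log-likelihood of log(x) minus sum(log x)
    return loglik_normal(np.log(x), mu, sd) - np.sum(np.log(x))

print(f"normal    log-likelihood: {loglik_normal(damage, mu_n, sd_n):.1f}")
print(f"lognormal log-likelihood: {loglik_lognormal(damage, mu_ln, sd_ln):.1f}")
```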
Abstract:
Hypoglycemia, if recurrent, may have severe consequences for the cognitive and psychomotor development of neonates. Therefore, screening for hypoglycemia is a daily routine in every facility taking care of newborn infants. Point-of-care testing (POCT) devices are interesting for neonatal use, as their handling is easy, measurements can be performed at the bedside, the required blood volume is small and results are readily available. However, such whole-blood measurements are challenged by the wide variation of hematocrit in neonates and a spectrum of normal glucose concentrations at the lower end of the test range. We conducted a prospective trial to check the precision and accuracy of the most suitable POCT device for neonatal use from each of three leading companies in Europe. Of the three devices tested (Precision Xceed, Abbott; Elite XL, Bayer; Aviva Nano, Roche), the Aviva Nano exhibited the best precision. None completely fulfilled the ISO 15197 accuracy criteria (2003 or 2011). The Aviva Nano fulfilled these criteria in 92% of cases, while the others were <87%. The Precision Xceed reached the 95% limit of the 2003 ISO criteria for values ≤4.2 mmol/L, but not for the higher range (71%). Although validated for adults, new POCT devices need to be specifically evaluated in newborn infants before being adopted for routine use in neonatology.
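The kind of accuracy check reported above can be sketched as follows. The bounds encoded here are my reading of the ISO 15197:2003 criterion (results within ±0.83 mmol/L when the reference is below 4.2 mmol/L and within ±20% otherwise, with at least 95% of readings required to comply); the paired readings are simulated, not the trial's measurements.

```python
import numpy as np

def within_iso15197_2003(poct, reference):
    """Fraction of POCT readings inside the ISO 15197:2003 accuracy bounds:
    +/- 0.83 mmol/L when the reference is < 4.2 mmol/L, +/- 20% otherwise.
    The standard requires >= 95% of readings to meet this."""
    poct, reference = np.asarray(poct), np.asarray(reference)
    low = reference < 4.2
    ok = np.where(low,
                  np.abs(poct - reference) <= 0.83,
                  np.abs(poct - reference) <= 0.20 * reference)
    return ok.mean()

# Simulated paired measurements (mmol/L) standing in for neonatal samples;
# the bias and noise of the hypothetical meter are invented for illustration.
rng = np.random.default_rng(4)
ref = rng.uniform(1.5, 8.0, 200)
meter = ref + rng.normal(0.1, 0.4, 200)
print(f"{within_iso15197_2003(meter, ref):.1%} of readings within ISO 15197:2003 bounds")
```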
Abstract:
Land cover classification is a key research field in remote sensing and land change science, as thematic maps derived from remotely sensed data have become the basis for analyzing many socio-ecological issues. However, land cover classification remains a difficult task, and it is especially challenging in heterogeneous tropical landscapes, where such maps are nonetheless of great importance. The present study aims to establish an efficient classification approach to accurately map all broad land cover classes in a large, heterogeneous tropical area of Bolivia, as a basis for further studies (e.g., land cover-land use change). Specifically, we compare the performance of parametric (maximum likelihood), non-parametric (k-nearest neighbour and four different support vector machines, SVM), and hybrid classifiers, using both hard and soft (fuzzy) accuracy assessments. In addition, we test whether the inclusion of a textural index (homogeneity) in the classifications improves their performance. We classified Landsat imagery for two dates corresponding to the dry and wet seasons and found that non-parametric classifiers, and particularly SVM, outperformed both parametric and hybrid classifiers. We also found that the use of the homogeneity index along with the reflectance bands significantly increased the overall accuracy of all the classifications, but particularly of the SVM algorithms. We observed that the improvements in producer’s and user’s accuracies through the inclusion of the homogeneity index differed between land cover classes. Early-growth/degraded forests, pastures, grasslands and savanna were the classes most improved, especially with the SVM radial basis function and SVM sigmoid classifiers, though with both classifiers all land cover classes were mapped with producer’s and user’s accuracies of around 90%. Our approach seems very well suited to accurately map land cover in tropical regions, and thus has the potential to contribute to conservation initiatives, climate change mitigation schemes such as REDD+, and rural development policies.
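The comparison of an SVM trained on reflectance bands alone versus bands plus a texture feature can be sketched with scikit-learn as below. The pixels, band values, class labels and homogeneity values are simulated stand-ins (not the Bolivian Landsat data), and the RBF kernel and cross-validated accuracy are only one reasonable way to run such a test.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(5)

# Simulated training pixels: 6 reflectance bands plus a texture (homogeneity) index
# for 4 land cover classes. In this toy setup only the texture carries class signal.
n = 600
bands = rng.normal(size=(n, 6))
labels = rng.integers(0, 4, size=n)
homogeneity = labels * 0.5 + rng.normal(scale=0.6, size=n)
X_bands = bands
X_bands_tex = np.column_stack([bands, homogeneity])

svm_rbf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10, gamma="scale"))

for name, X in [("bands only", X_bands), ("bands + homogeneity", X_bands_tex)]:
    acc = cross_val_score(svm_rbf, X, labels, cv=5).mean()
    print(f"{name}: mean CV accuracy = {acc:.2f}")
```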
Abstract:
Hypoglycaemia is a major cause of neonatal morbidity and may induce long-term developmental sequelae. Clinical signs of hypoglycaemia in neonatal infants are unspecific or even absent, and therefore precise and accurate methods for the assessment of glycaemia are needed. Glycaemia measurement in newborns has some particularities, such as a very low limit of normal glucose concentration compared to adults and a large range of normal haematocrit values. Many bedside point-of-care testing (POCT) systems are available, but the literature about their accuracy in newborn infants is scarce and not very convincing. In this retrospective study, we identified 1,324 paired glycaemia results over a 1-year study period, one obtained at the bedside with one of three different POCT systems (Elite XL, Ascensia Contour and ABL 735) and the other in the central laboratory of the hospital with the hexokinase reference method. All three POCT systems tended to overestimate glycaemia values, and none of them fulfilled the ISO 15197 accuracy criteria. The Elite XL appeared to be more appropriate than the Contour for detecting hypoglycaemia, although with low specificity. The Contour additionally showed important inaccuracy with increasing haematocrit. The bench analyzer ABL 735 was the most accurate of the three tested POCT systems. Both of the tested handheld glucometers have important drawbacks when used as screening tools for hypoglycaemia in newborn infants. The ABL 735 could be a valuable alternative, but the blood volume needed is more than 15 times higher than for handheld glucometers. Before daily use in the newborn population, careful clinical evaluation of each new POCT system for glucose measurement is of utmost importance.
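The systematic overestimation reported above is typically summarised by the mean bias and 95% limits of agreement against the reference method; a minimal sketch follows. The paired readings and the small positive offset of the hypothetical meter are invented to mimic the described behaviour, not the study's actual figures.

```python
import numpy as np

def bland_altman(poct, reference):
    """Mean bias and 95% limits of agreement between a POCT system and the
    laboratory reference (hexokinase) method, in mmol/L."""
    diff = np.asarray(poct) - np.asarray(reference)
    bias = diff.mean()
    spread = 1.96 * diff.std(ddof=1)
    return bias, bias - spread, bias + spread

# Simulated paired readings (mmol/L) for a hypothetical overestimating meter.
rng = np.random.default_rng(6)
ref = rng.uniform(1.5, 10.0, 300)
poct = ref + 0.25 + rng.normal(0, 0.45, 300)
bias, lo, hi = bland_altman(poct, ref)
print(f"bias = {bias:+.2f} mmol/L, 95% limits of agreement [{lo:+.2f}, {hi:+.2f}]")
```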
Abstract:
While mobile technologies can provide great personalized services for mobile users, they also threaten users' privacy. This personalization-privacy paradox is particularly salient for mobile applications based on context-aware technology, where a user's behaviors, movements and habits can be associated with their personal identity. In this thesis, I study privacy issues in the mobile context, focusing particularly on the design of an adaptive privacy management system for context-aware mobile devices, and explore the role of personalization and of control over the user's personal data. This allowed me to make multiple contributions, both theoretical and practical. In the theoretical realm, I propose and prototype an adaptive Single Sign-On solution that uses the user's context information to protect private information on smartphones. To validate this solution, I first showed that a user's context is a unique user identifier and that context-awareness technology can increase the user's perceived ease of use of the system and the service provider's authentication security. I then followed a design science research paradigm and implemented this solution in a mobile application called "Privacy Manager". I evaluated its utility through several focus group interviews; overall, the proposed solution fulfilled the expected functions, and users expressed their intention to use the application. To better understand the personalization-privacy paradox, I built on the theoretical foundations of privacy calculus and the technology acceptance model to conceptualize a theory of users' mobile privacy management. I also examined the role of personalization and control in my model and how these two elements interact with the privacy calculus and the technology acceptance model. In the practical realm, this thesis contributes to the understanding of the tradeoff between the benefits of personalized services and the privacy concerns they may cause. By pointing out new opportunities to rethink how the user's context information can protect private data, it also suggests new elements for privacy-related business models.