1000 results for flash point
Abstract:
The year 2014 was rich in significant advances in all areas of internal medicine. Many of them have an impact on our daily practice and on the way we manage one problem or another. From the use of ultrasound for the diagnosis of pneumonia, to the choice of the site of venous access and the type of line, to the increasing complexity of choosing an oral anticoagulant agent, this selection offers readers a brief overview of the major advances. The chief residents of the Service of Internal Medicine of Lausanne University Hospital are pleased to share their readings.
Abstract:
We report the outcomes of a clinical audit examining the criteria used in clinical practice to rationalize endotracheal tube (ETT) suction, and the extent to which these matched the criteria in the Endotracheal Suction Assessment Tool (ESAT)©. A retrospective audit of patient notes (N = 292) and analyses of the criteria documented by pediatric intensive care nurses to rationalize ETT suction were undertaken. The median number of documented respiratory and ventilation status criteria per ETT suction event that matched the ESAT© criteria was 2 [interquartile range (IQR) 1-6]. All criteria listed within the ESAT© were documented within the reviewed notes. A direct link was established between the criteria used in current clinical practice of ETT suction and the ESAT©. The ESAT© therefore reflects documented clinical decision making and could be used as both a clinical and an educational guide for inexperienced pediatric critical care nurses. One modification to the ESAT© is required: "preparation for extubation" should be added.
Abstract:
Research on language equations has been active in recent decades. Compared with equations on words, equations on languages are much more difficult to solve: even very simple equations that are easy to solve for words can be very hard for languages. In this thesis we study two such equations, namely the commutation and conjugacy equations. We study these equations in some restricted special cases and compare some of the results to the solutions of the corresponding equations on words. For both equations we study the maximal solutions, the centralizer and the conjugator. We present a fixed point method that can be used to search for these maximal solutions, and we analyze why this method is not successful for all languages. We also give several examples to illustrate the behaviour of this method.
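As a rough illustration of the fixed point idea for the commutation equation XL = LX, the sketch below (a hypothetical Python toy, not the thesis's actual method) runs a greatest-fixed-point iteration over a bounded universe of words: start from all words up to a length cap and repeatedly delete words that violate commutation within that universe. The truncation itself hints at the difficulty the thesis analyzes: products can leave the bounded universe, so the result is only an approximation of the centralizer.

```python
from itertools import product

def words_up_to(alphabet, max_len):
    """All words over `alphabet` of length <= max_len, including the empty word."""
    words = {''}
    for n in range(1, max_len + 1):
        words |= {''.join(p) for p in product(alphabet, repeat=n)}
    return words

def approximate_centralizer(L, alphabet, max_len):
    """Greatest-fixed-point iteration for XL = LX, truncated to short words.

    Starts from the full universe and repeatedly removes words that break
    XL = LX inside the bounded universe. Because products x + u can be
    longer than max_len, the truncation can wrongly discard valid words
    near the length cap, so the result is only an approximation.
    """
    X = words_up_to(alphabet, max_len)
    while True:
        XL = {x + u for x in X for u in L}
        LX = {u + x for u in L for x in X}
        bad = {x for x in X
               if any(x + u not in LX for u in L)    # witness against XL = LX
               or any(u + x not in XL for u in L)}
        if not bad:
            return X
        X -= bad

# For L = {'a'} the iteration converges to a*, cut at the length cap.
print(sorted(approximate_centralizer({'a'}, 'ab', 3)))  # ['', 'a', 'aa', 'aaa']
```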
Abstract:
The goal of this thesis is to implement software for creating 3D models from point clouds. Point clouds are acquired with stereo cameras, monocular systems or laser scanners. The created 3D models are triangular models or NURBS (Non-Uniform Rational B-Splines) models. Triangular models are constructed from selected areas of the point clouds, and the resulting triangular models are translated into a set of quads. The quads are further translated into an estimated grid structure and used for NURBS surface approximation. Finally, we have a set of NURBS surfaces that represent the whole model. The problem was not straightforward to solve: the selected triangular surface reconstruction algorithm did not deal well with noise in the point clouds. To handle this problem, a clustering method is introduced for simplifying the model and removing noise. As the smaller point clouds produced by clustering gave better results, we used the points in the clusters to better estimate the grids for the NURBS models. The overall results were good when the point cloud did not have much noise: point clouds with a small amount of error yielded solid triangular models, and NURBS surface reconstruction performed well on solid models.
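A minimal sketch of the kind of clustering-based simplification described above, assuming scikit-learn is available; replacing each cluster with its centroid averages out zero-mean noise and shrinks the cloud to a fixed number of representative points (`simplify_point_cloud` and its parameters are illustrative names, not the thesis implementation):

```python
import numpy as np
from sklearn.cluster import KMeans

def simplify_point_cloud(points: np.ndarray, n_clusters: int = 500,
                         seed: int = 0) -> np.ndarray:
    """Reduce an (N, 3) point cloud to n_clusters centroid points.

    Averaging within clusters suppresses zero-mean measurement noise,
    which stabilises later triangulation and grid-estimation steps.
    """
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=seed)
    km.fit(points)
    return km.cluster_centers_

# Example: denoise a synthetic noisy sphere down to 200 representative points.
rng = np.random.default_rng(0)
directions = rng.normal(size=(5000, 3))
sphere = directions / np.linalg.norm(directions, axis=1, keepdims=True)
noisy = sphere + rng.normal(scale=0.02, size=sphere.shape)
print(simplify_point_cloud(noisy, n_clusters=200).shape)  # (200, 3)
```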
Abstract:
The present paper shows an in-depth analysis of the evolution of floods and precipitation in Catalonia for the period 1981-2010. In order to have homogeneous information, and bearing in mind that gauge data was not available for all the events, nor for all the rivers and streams, the daily press of a specific newspaper has been systematically analysed for this period. Furthermore, a comparison with a longer period starting in 1900 has been made. 219 flood events (mainly flash flood events) have been identified for the 30-year period (375 starting in 1900); 79 of them were ordinary, 117 extraordinary and 23 catastrophic, with autumn and summer being the seasons with the maximum values. 19% of the events caused a total of 110 casualties; 60% of the victims died when they tried to cross a street or a stream. Factors like the evolution of precipitation, population density and other socio-economic aspects have been considered. The trend analysis shows an increase of 1 flood/decade, probably due mainly to inter-annual and intra-annual changes in population density and in land use and land cover.
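To illustrate the kind of trend analysis mentioned, the sketch below fits a linear trend to yearly flood counts with scipy and rescales the slope to floods per decade. The counts are synthetic stand-ins generated with a built-in trend, not the paper's data:

```python
import numpy as np
from scipy.stats import linregress

# Synthetic stand-in for the yearly flood counts (the paper's data is not
# reproduced here): a Poisson series with a built-in trend of ~1 flood/decade.
rng = np.random.default_rng(42)
years = np.arange(1981, 2011)
counts = rng.poisson(6.0 + 0.1 * (years - years[0]))

fit = linregress(years, counts)
print(f"estimated trend: {fit.slope * 10:.1f} floods/decade (p = {fit.pvalue:.3f})")
```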
Abstract:
Emission trading with greenhouse gases and green certificates are part of the climate policy whose main target is to reduce greenhouse gas emissions. This study calculates the carbon dioxide and fine particle emissions of energy production in the Helsinki metropolitan area. The analysis is made mainly from the district heating point of view, and the changes to the district heating network are assessed. Carbon dioxide emissions would be somewhat higher if the district heating network is expanded, but the fine particle emissions would be much lower. In 2030, carbon dioxide emissions are roughly 10% higher if the district heating network is expanded at the same rate as it has been in the past five years, while the expansion would decrease the fine particle emissions by about 40%. The cost of the expansion, allocated as the reduction cost of the fine particle emissions, is considerably higher than the costs of traditional reduction methods. A possible new nuclear plant would reduce the emissions considerably, and its costs would be relatively low compared with other energy production methods.
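The scenario comparison boils down to scaling per-unit emission factors by delivered energy. A minimal sketch, with hypothetical factors and demand figures chosen only so the output mirrors the reported ~+10% CO2 / ~-40% fine-particle deltas (none of these numbers come from the study):

```python
def annual_emissions(heat_gwh: float, co2_t_per_gwh: float,
                     pm_kg_per_gwh: float) -> tuple[float, float]:
    """Scale per-GWh emission factors by annual delivered heat."""
    return heat_gwh * co2_t_per_gwh, heat_gwh * pm_kg_per_gwh

# Hypothetical 2030 scenarios (placeholder numbers, not from the study).
co2_base, pm_base = annual_emissions(7000, 220.0, 15.0)
# Expansion: more heat sold, but district heat displaces dirtier local burning,
# so the fine-particle factor per delivered GWh drops.
co2_exp, pm_exp = annual_emissions(7700, 220.0, 8.2)

print(f"CO2: {100 * (co2_exp / co2_base - 1):+.0f} %")  # about +10 %
print(f"PM:  {100 * (pm_exp / pm_base - 1):+.0f} %")    # about -40 %
```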
Abstract:
In this paper we address the problem of extracting representative point samples from polygonal models. The goal of such a sampling algorithm is to find points that are evenly distributed. We propose star-discrepancy as a measure of sampling quality and present new sampling methods based on global line distributions. We investigate several line generation algorithms, including an efficient hardware-based sampling method. Our method contributes to the area of point-based graphics by extracting points that are more evenly distributed than those produced by current sampling algorithms.
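Star discrepancy measures the worst-case mismatch between the fraction of samples falling in an origin-anchored box and that box's volume; exact computation is expensive, but sampling random boxes gives a lower bound. A minimal sketch of such an estimator (illustrative only, not the paper's evaluation code), for points normalized to the unit cube:

```python
import numpy as np

def star_discrepancy_lower_bound(points: np.ndarray, n_boxes: int = 20000,
                                 seed: int = 0) -> float:
    """Monte Carlo lower bound on the star discrepancy of points in [0, 1]^d.

    The star discrepancy is the supremum over boxes [0, c) of
    |fraction of points inside - volume of the box|; evaluating random
    corners c gives a lower bound on that supremum.
    """
    rng = np.random.default_rng(seed)
    corners = rng.random((n_boxes, points.shape[1]))
    worst = 0.0
    for c in corners:
        inside = np.mean(np.all(points < c, axis=1))
        worst = max(worst, abs(inside - np.prod(c)))
    return worst

# Uniform random points typically show higher discrepancy than an evenly
# distributed (low-discrepancy) point set of the same size.
pts = np.random.default_rng(1).random((1024, 2))
print(f"D* lower bound: {star_discrepancy_lower_bound(pts):.4f}")
```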
Abstract:
We introduce a method for surface reconstruction from point sets that is able to cope with noise and outliers. First, a splat-based representation is computed from the point set: a robust local 3D RANSAC-based procedure is used to filter the point set for outliers, and then a local jet surface (a low-degree polynomial surface approximation) is fitted to the inliers. Second, we extract the reconstructed surface in the form of a surface triangle mesh through Delaunay refinement. The Delaunay refinement meshing approach requires computing intersections between line segment queries and the surface to be meshed; in the present case, intersection queries are solved from the set of splats through a 1D RANSAC procedure.
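To give the flavor of the local RANSAC filtering step (the paper fits low-degree jets; a plane, i.e. a degree-1 fit, keeps the sketch short), the hypothetical routine below repeatedly fits a plane through three random points of a local neighborhood and keeps the fit with the largest consensus set:

```python
import numpy as np

def ransac_plane_inliers(nbhd: np.ndarray, n_iters: int = 200,
                         threshold: float = 0.01, seed: int = 0) -> np.ndarray:
    """Mask of neighborhood points within `threshold` of the best RANSAC plane.

    Each iteration fits a plane through 3 random points and counts the points
    whose distance to that plane is below the threshold; the densest consensus
    set is returned, so a smooth local surface can then be fitted to inliers only.
    """
    rng = np.random.default_rng(seed)
    best_mask = np.zeros(len(nbhd), dtype=bool)
    for _ in range(n_iters):
        p0, p1, p2 = nbhd[rng.choice(len(nbhd), size=3, replace=False)]
        normal = np.cross(p1 - p0, p2 - p0)
        norm = np.linalg.norm(normal)
        if norm < 1e-12:          # degenerate (collinear) sample, skip
            continue
        normal /= norm
        dist = np.abs((nbhd - p0) @ normal)
        mask = dist < threshold
        if mask.sum() > best_mask.sum():
            best_mask = mask
    return best_mask
```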