897 results for Transformation-based semi-parametric estimators
Abstract:
Medical errors and close calls are pervasive in health care. It is hypothesized that the causes of close calls are the same as those of medical errors; therefore, learning about close calls can help prevent errors and increase patient safety. Yet despite efforts to encourage close call reporting, close calls as well as medical errors are under-reported in health care. The purpose of this dissertation was to implement and evaluate a web-based anonymous close call reporting system in three units at an urban hospital. The study participants were physicians, nurses and medical technicians (N = 187) who care for patients in the Medical Intermediate Care Unit, the Surgical Intermediate Care Unit, and the Coronary Catheterization Laboratory in the hospital. We provided educational information to the participants on how to use the system and e-mailed and delivered paper reminders to report throughout the 19-month project. We surveyed the participants at the beginning and at the end of the study to assess their attitudes and beliefs regarding incident reporting. We found that the majority of the health care providers in our study were supportive of incident reporting in general, but in practice very few had actually reported an error or a close call. We conducted semi-structured interviews 20 weeks after we made the close call reporting system available; their purpose was to further assess the participants' attitudes regarding incident reporting and the reporting system. Our findings suggest that the health care providers are supportive of medical error reporting in general, but are not convinced of the benefit of reporting close calls. Barriers to close call reporting cited include lack of time, heavy workloads, preferring to take care of close calls "on the spot", and not seeing the benefits of close call reporting. Consequently, only two close calls were reported via the system, by two separate caregivers, during the project.
The findings suggest that future efforts to increase close call reporting must address barriers to reporting, especially the belief among caregivers that it is not worth taking time from their already busy schedules to report close calls.
Abstract:
The combination of two research projects offered us the opportunity to perform a comprehensive study of the seasonal evolution of the hydrological structure and the circulation of the North Aegean Sea, at the northern extremes of the eastern Mediterranean. The combination of brackish water inflow from the Dardanelles and the sea-bottom relief dictates the significant differences between the North and South Aegean water columns. The relatively warm and highly saline South Aegean waters enter the North Aegean through the dominant cyclonic circulation of the basin. In the North Aegean, three layers of distinct water masses with very different properties are observed: the 20-50 m thick surface layer is occupied mainly by Black Sea Water (BSW), modified on its way through the Bosphorus, the Sea of Marmara and the Dardanelles. Below the surface layer there is warm and highly saline water originating in the South Aegean and the Levantine, extending down to 350-400 m depth. Below this layer, the deeper-than-400 m basins of the North Aegean contain locally formed, very dense water with different θ/S characteristics in each sub-basin. The circulation is characterised by a series of permanent, semi-permanent and transient mesoscale features, overlaid on the general slow cyclonic circulation of the Aegean. The mesoscale activity, while not necessarily important in enhancing isopycnal mixing in the region, in combination with the very high stratification of the upper layers increases the residence time of the water of the upper layers in the general area of the North Aegean. As a result, water that flowed out of the Black Sea in winter forms a separate distinct layer in the region in spring (lying between "younger" BSW and the water of Levantine origin), and is still traceable in the water column in late summer.
Abstract:
Introduction: The source and deployment of finance are central issues in economic development. Since 1966, when the Soeharto Administration was inaugurated, Indonesian economic development has relied on funds in the form of aid from international organizations and foreign countries. After the 1990s, a further abundant inflow of capital sustained rapid economic development. Foreign funding was the basis of Indonesian economic growth. This paper will describe the mechanism for allocating funds in the Indonesian economy. It will identify the problems this mechanism generated in the Indonesian experience, and it will attempt to explain why there was a collapse of the financial system in the wake of the Asian Currency Crisis of 1997. History of the Indonesian financial system: The year 1966 saw the emergence of commercial banks in Indonesia. It can be said that before 1966 a financial system hardly existed, a fact commonly attributed to economic disruptions like the consecutive runs of fiscal deficit and hyperinflation under the Soekarno Administration. After 1966, with the inauguration of Soeharto, a regulatory system of financial legislation, e.g. central banking law and banking regulation, was introduced and implemented, and the banking sector that is the basis of the current financial system in Indonesia was built up. The Indonesian financial structure was significantly altered by the first financial reform of 1983. Between 1966 and 1982, the banking sector consisted of Bank Indonesia (the central bank) and the state-owned banks. There was also a system for distributing the abundant public revenue derived from the soaring oil price of the 1970s.
The public finance distribution function, incorporated in the Indonesian financial system, changed after the successive financial reforms of 1983 and 1988, when there was a move away from the monopoly-market style dominated by state-owned banks (a system of public finance distribution that operated at the discretion of the government) towards a modern market mechanism. The five phases of development: The Indonesian financial system developed in five phases between 1966 and the present time. The first period (1966-72) was its formative period, the second (1973-82) its period of policy-based finance under soaring oil prices, the third (1983-91) its financial-reform period, the fourth (1992-97) its period of expansion, and the fifth (1998-) its period of financial restructuring. The first section of this paper summarizes the financial policies operative during each of the periods identified above. In the second section changes to the financial sector in response to policies are examined, and an analysis of these changes shows that an important development of the financial sector occurred during the financial-reform period. In the third section the focus of analysis shifts from the general financial sector to the performance of particular commercial banks: changes in commercial banks' lending and fund-raising behaviour after the 1990s are analysed by comparing several banking groups in terms of their ownership and time of foundation. The last section summarizes the foregoing analyses and examines the problems that remain in the Indonesian financial sector, which is still undergoing restructuring.
Abstract:
The objective of this study was to generate an edaphogeomorphological model useful for identifying soil management requirements. The study was carried out in the Cañada La Gorda watershed, Machiques-Colón, Zulia State, Venezuela, an area characterised by a sub-humid tropical climate with a growing period of 230 days, an Ustic soil moisture regime and an Isohyperthermic temperature regime. The factorial soil-formation equation approach was used for the biophysical analysis and description of the soil-forming factors along a catena. Relief was characterised from aerial photographs, satellite images and systematic field checks along transects laid out in the direction of surface runoff; vegetation was characterised through land use, vegetation cover, identification of the dominant species from their vernacular names, and the definition of vegetation indicators (Iv). Soils were described and classified according to Soil Taxonomy and rated with the parametric model of Riquier et al. (1970) to determine the productivity index (PI). Two geomorphological landscapes were characterised, Hills (C) and Valley (V), with six geomorphological positions between them defined by the succession of relief forms down-slope: hilltop (TC), preserved tableland summit (MC), high (VA), middle (VM) and low (VB) tableland slopes, and intra-hill valley (VI). An equal number of representative soil profiles were studied; these showed highly advanced pedogenesis, with PI values below 8% at all positions except VB, which reached a productivity of 13%. Land use is based on semi-intensive grazing of introduced forage species. The predominant vegetation formations were scattered brushwood and shrubland, together with remnants of a tropophilous forest strongly affected by timber extraction and conversion to pasture. Eight Iv were identified, strongly associated with the physical and hydrological conditions of the soil. The heavy impact of human activity on soil and vegetation, expressed through active erosion processes, the absence of wooded areas and the low productivity of the livestock systems reported for the area, indicates the need to reorient present land use; alternatives such as the incorporation of protective forests and agrosilvopastoral systems are proposed.
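As a rough illustration of how a multiplicative parametric rating of this kind behaves, the sketch below computes a productivity index as the product of per-factor ratings, so any severely limiting factor drags the whole index down. The factor names and values are hypothetical and do not reproduce the actual coefficients of Riquier et al. (1970):

```python
def productivity_index(ratings):
    """ratings: factor name -> rating in (0, 1], each expressing how
    close the factor is to its optimum. Returns the index in %."""
    pi = 100.0
    for r in ratings.values():
        pi *= r                     # multiplicative: limits compound
    return pi

# Hypothetical ratings for a low-slope profile (not measured values):
low_slope_profile = {"moisture": 0.9, "drainage": 0.7, "depth": 0.8,
                     "texture": 0.6, "nutrients": 0.5}
print(f"PI = {productivity_index(low_slope_profile):.1f}%")
```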
Abstract:
This study presents a robust method for ground plane detection in vision-based systems with a non-stationary camera. The proposed method is based on the reliable estimation of the homography between ground planes in successive images. This homography is computed using a feature matching approach which, in contrast to classical approaches to on-board motion estimation, does not require explicit ego-motion calculation. Instead, a novel homography calculation method based on a linear estimation framework is presented. This framework provides predictions of the ground plane transformation matrix that are dynamically updated with new measurements. The method is especially suited for challenging environments, in particular traffic scenarios, in which information is scarce and the homography computed from the images is often inaccurate or erroneous. The proposed estimation framework is able to remove erroneous measurements and to correct those that are inaccurate, hence producing a reliable homography estimate at each instant. It is based on the evaluation of the difference between the predicted and the observed transformations, measured according to the spectral norm of the associated matrix of differences. Moreover, an example is provided of how to use the information extracted from ground plane estimation to achieve object detection and tracking. The method has been successfully demonstrated for the detection of moving vehicles in traffic environments.
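A minimal sketch of the gating idea described above: the observed homography is accepted only when the spectral norm of its difference from the prediction is small, otherwise the prediction is kept. The threshold value and the fallback policy are illustrative assumptions, not the paper's exact update rule:

```python
import numpy as np

def spectral_norm(M):
    # Largest singular value of M (the matrix 2-norm).
    return np.linalg.norm(M, 2)

def gate_homography(H_pred, H_obs, threshold=0.1):
    """Accept the observed ground-plane homography only if it is close
    to the predicted one, measured by the spectral norm of the matrix
    of differences. Both matrices are normalised so H[2, 2] == 1
    before comparison."""
    Hp = H_pred / H_pred[2, 2]
    Ho = H_obs / H_obs[2, 2]
    if spectral_norm(Ho - Hp) <= threshold:
        return Ho            # consistent: keep the measurement
    return Hp                # erroneous: fall back on the prediction
```

In a full filter the accepted measurement would also feed back into the prediction for the next frame; the sketch only shows the accept/reject decision.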
Abstract:
When a firm decides to implement ERP software, the resulting consequences can pervade all levels, including organization, process, control and available information. Therefore, the first decision to be made is which ERP solution to adopt from a wide range of offers and vendors. To this end, this paper describes a methodology based on multi-criteria factors that directly affect the process, to help managers make this decision. This methodology has been applied to a medium-size company in the Spanish metal transformation sector which is interested in updating its IT capabilities in order to obtain greater control of and better information about its business, thus achieving a competitive advantage. The paper proposes a decision matrix which takes into account all critical factors in ERP selection.
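A weighted decision matrix of the kind the paper proposes can be sketched as follows; the criteria, weights and vendor scores here are hypothetical placeholders, not the critical factors identified by the methodology:

```python
import numpy as np

# Hypothetical criteria weights and vendor scores on a 1-10 scale.
criteria = ["functionality", "cost", "vendor support", "implementation time"]
weights = np.array([0.4, 0.25, 0.2, 0.15])   # must sum to 1

scores = {                      # rows follow the criteria order above
    "ERP-A": np.array([8, 6, 7, 5]),
    "ERP-B": np.array([7, 8, 6, 7]),
    "ERP-C": np.array([6, 9, 8, 6]),
}

# Weighted total per vendor, ranked best-first.
ranking = sorted(((v, float(weights @ s)) for v, s in scores.items()),
                 key=lambda kv: kv[1], reverse=True)
for vendor, total in ranking:
    print(f"{vendor}: {total:.2f}")
```

Managers would then pick the top-ranked solution, or use the totals to shortlist vendors for deeper evaluation.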
Abstract:
This article examines a new lightweight, slim, highly energy-efficient, light-transmitting, self-supporting envelope system, providing for seamless, free-form designs for use in architectural projects. The system exploits vacuum insulation panel technology. The research was based on envelope components already existing on the market and on patents and prototypes built by independent laboratories, especially components implemented with silica gel insulation, as this is the most effective transparent thermal insulation there is today. The tests run on these materials revealed that no single component has all the features required of the new envelope model, although some do have properties that could be exploited to generate this envelope, namely, the vacuum chamber of vacuum insulation panels, the use of monolithic aerogel as insulation in some prototypes, and reinforced polyester barriers. These three design components have been combined and tested to design a new, variable-geometry, energy-saving envelope system that also solves many of the problems that other studies ascribe to the use of vacuum insulation panels.
Abstract:
The objective of this paper is to evaluate the behaviour of a controller designed using a parametric eigenstructure assignment method and to evaluate its suitability for use in flexible spacecraft. The challenge of this objective lies in obtaining a suitable controller that is specifically designed to alleviate the deflections and vibrations suffered by external appendages of flexible spacecraft while performing attitude manoeuvres. One of the main problems in these vehicles is the mechanical cross-coupling that exists between the rigid and flexible parts of the spacecraft. Spacecraft with fine attitude-pointing requirements need precise control of the mechanical coupling to avoid undesired attitude misalignment. In designing an attitude controller, it is necessary to consider the possible vibration of the solar panels and how it may influence the performance of the rest of the vehicle. The nonlinear mathematical model of a flexible spacecraft is considered a close approximation to the real system. During the process of controller evaluation, the design process has also been taken into account as a factor in assessing the robustness of the system.
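As a hint of what eigenvalue assignment involves in its simplest single-input form (the paper's parametric eigenstructure assignment additionally shapes the closed-loop eigenvectors, which is what lets it target the flexible modes), the following sketch places the closed-loop poles of a hypothetical two-state model using Ackermann's formula; the model values are illustrative, not a spacecraft model:

```python
import numpy as np

def ackermann_gain(A, b, poles):
    """State-feedback gain k such that eig(A - b k) are the desired
    poles, via Ackermann's formula (single-input pole placement)."""
    n = A.shape[0]
    # Controllability matrix [b, Ab, A^2 b, ...]
    C = np.hstack([np.linalg.matrix_power(A, i) @ b for i in range(n)])
    # Desired characteristic polynomial evaluated at A: phi(A)
    coeffs = np.poly(poles)                 # [1, a1, ..., an]
    phiA = sum(c * np.linalg.matrix_power(A, n - i)
               for i, c in enumerate(coeffs))
    e_last = np.zeros((1, n)); e_last[0, -1] = 1.0
    return e_last @ np.linalg.inv(C) @ phiA

# Hypothetical lightly damped oscillator (stand-in for a flexible mode):
A = np.array([[0.0, 1.0], [-1.0, -0.1]])
b = np.array([[0.0], [1.0]])
k = ackermann_gain(A, b, poles=[-2.0, -3.0])
print(np.linalg.eigvals(A - b @ k))
```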
Abstract:
We show a procedure for constructing a probabilistic atlas based on affine moment descriptors. It uses a normalization procedure over the labeled atlas. The proposed linear registration is defined by closed-form expressions involving only geometric moments. This procedure applies both to atlas construction and to atlas-based segmentation. We model the likelihood term for each voxel and each label using parametric or nonparametric distributions, and the prior term is determined by applying the vote rule. The probabilistic atlas is built from the variability of our linear registration. We have two segmentation strategies: a) applying the proposed affine registration to bring the target image into the coordinate frame of the atlas, or b) non-rigidly aligning the probabilistic atlas with the target image, after the atlas has first been aligned to the target image with our affine registration. Finally, we adopt a graph-cut Bayesian framework for implementing the atlas-based segmentation.
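The closed-form flavour of moment-based linear registration can be sketched as below; this toy version recovers only translation and axis-aligned scaling from first- and second-order geometric moments of a 2-D binary mask, whereas the full affine case also uses additional moment descriptors:

```python
import numpy as np

def moment_normalization(mask):
    """Closed-form normalisation parameters from geometric moments of
    a binary mask: the centroid (first-order moments) and per-axis
    scales (square roots of the central second-order moments)."""
    ys, xs = np.nonzero(mask)
    pts = np.stack([xs, ys], axis=1).astype(float)
    centroid = pts.mean(axis=0)          # first-order moments
    scales = (pts - centroid).std(axis=0)  # from second-order moments
    return centroid, scales

def to_canonical(points, centroid, scales):
    # Map points into the moment-normalised coordinate frame.
    return (points - centroid) / scales
```

Because both the atlas and the target can be mapped into this canonical frame in closed form, no iterative optimisation is needed for the linear part of the registration.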
Abstract:
In this paper we will see how the efficiency of MBS simulations can be improved in two different ways, by considering both an explicit and an implicit semi-recursive formulation. The explicit method is based on a double velocity transformation that involves the solution of a redundant but compatible system of equations. The high computational cost of this operation has been drastically reduced by taking into account the sparsity pattern of the system; to this end, the method introduces MA48, a high-performance mathematical library provided by the Harwell Subroutine Library. The second method proposed in this paper has the particularity that, depending on the case, between 70 and 85% of the computation time is devoted to the evaluation of force derivatives with respect to the relative position and velocity vectors. Since evaluating these derivatives can be decomposed into concurrent tasks, the main contribution here is a successful and straightforward parallel implementation that has led to a substantial improvement, with a speedup of 3.2 obtained by keeping all the cores of a quad-core processor busy and distributing the workload between them, thus achieving a large reduction in computation time with near-ideal CPU usage.
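The task decomposition described above can be sketched as follows: each column of the force-derivative matrix is an independent finite-difference evaluation with respect to one relative coordinate, so the columns can be dispatched to a worker pool. The force function below is a hypothetical stand-in for the real multibody force evaluation, which is what dominates the computation:

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def force(q, qd):
    # Hypothetical generalized force vector: a stiff cubic spring plus
    # viscous damping per coordinate (stand-in for the real model).
    return -q**3 - 0.5 * qd

def force_jacobian_columns(q, qd, h=1e-6):
    """Central-difference derivative of the forces with respect to the
    relative position vector. Each column is an independent task and is
    evaluated concurrently by the pool."""
    n = q.size
    def column(i):
        dq = np.zeros(n); dq[i] = h
        return (force(q + dq, qd) - force(q - dq, qd)) / (2 * h)
    with ThreadPoolExecutor() as pool:
        cols = list(pool.map(column, range(n)))
    return np.stack(cols, axis=1)
```

In a compiled implementation each worker would run on its own core; the same column-per-task decomposition applies to the velocity derivatives.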
Abstract:
We introduce an innovative, semi-automatic method to transform low-resolution facial meshes into high-definition ones, based on tailoring a generic, neutral human head model, designed by an artist, to fit the facial features of a specific person. To determine these facial features we need to select a set of "control points" (corners of eyes, lips, etc.) in at least two photographs of the subject's face. The neutral head mesh is then automatically reshaped, according to the relation between the control points and the original subject's mesh, through a set of transformation pyramids. The last step consists of merging both meshes and filling the gaps that appear during the process. This algorithm avoids the use of expensive and complicated technologies to obtain depth maps, which also need to be meshed later.
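A much-simplified stand-in for the reshaping step: a least-squares affine transform fitted to the control-point correspondences and then applied to every vertex of the neutral mesh. The paper's transformation pyramids allow more local deformation than this single global affine map, so the sketch only illustrates the correspondence-driven idea:

```python
import numpy as np

def fit_affine(src, dst):
    """Least-squares 3-D affine transform mapping control points `src`
    (on the generic head) onto `dst` (on the subject). Returns a
    (4, 3) parameter matrix acting on homogeneous coordinates."""
    n = src.shape[0]
    A = np.hstack([src, np.ones((n, 1))])        # homogeneous coords
    X, *_ = np.linalg.lstsq(A, dst, rcond=None)
    return X

def apply_affine(points, X):
    # Reshape arbitrary mesh vertices with the fitted transform.
    return np.hstack([points, np.ones((len(points), 1))]) @ X
```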
Abstract:
Static analyses of object-oriented programs usually rely on intermediate representations that respect the original semantics while having a more uniform and basic syntax. Most work involving object-oriented languages and abstract interpretation omits the description of that intermediate language or just refers to the Control Flow Graph (CFG) it represents. However, this lack of formalization results, on one hand, in an absence of assurances regarding the correctness of the transformation and, on the other, typically couples the analysis strongly to the source language. In this work we present a framework for the analysis of object-oriented languages in which, in a first phase, we transform the input program into a representation based on Horn clauses. This allows, on one hand, proving the transformation correct subject to a simple condition and, on the other, applying an existing analyzer for (constraint) logic programming to automatically derive a safe approximation of the semantics of the original program. The approach is flexible in the sense that the first phase decouples the analyzer from most language-dependent features, and correct because the set of Horn clauses returned by the transformation phase safely approximates the standard semantics of the input program. The resulting analysis is also reasonably scalable thanks to the use of mature, modular (C)LP-based analyzers. The overall approach allows us to report results for medium-sized programs.
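The flavour of a Horn-clause representation and its bottom-up analysis can be sketched as follows; the clauses encode a hypothetical program property (definite initialisation), not the paper's actual translation of an object-oriented program, and the fixpoint loop is the naive version of what a real (C)LP analyzer computes:

```python
# Each Horn clause is (head, [body atoms]); facts have an empty body.
# Atoms are ground tuples here for simplicity.
clauses = [
    (("init", "x"), []),                        # x is initialised
    (("init", "y"), [("init", "x")]),           # y := f(x)
    (("safe", "m"), [("init", "x"), ("init", "y")]),
]

def least_model(clauses):
    """Naive bottom-up evaluation: repeatedly fire clauses whose body
    is already derived, until a least fixpoint is reached."""
    model, changed = set(), True
    while changed:
        changed = False
        for head, body in clauses:
            if head not in model and all(a in model for a in body):
                model.add(head)
                changed = True
    return model

print(least_model(clauses))
```

A derived atom such as `("safe", "m")` then stands for a property proved to hold in the approximated semantics of the original program.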
Abstract:
We propose a general framework for assertion-based debugging of constraint logic programs. Assertions are linguistic constructions for expressing properties of programs. We define several assertion schemas for writing (partial) specifications for constraint logic programs using quite general properties, including user-defined properties. The framework is aimed at detecting deviations of the program behavior (symptoms) with respect to the given assertions, either at compile-time (i.e., statically) or at run-time (i.e., dynamically). We provide techniques for using information from global analysis both to detect at compile-time assertions which do not hold in at least one of the possible executions (i.e., static symptoms) and assertions which hold for all possible executions (i.e., statically proved assertions). We also provide program transformations which introduce tests in the program for checking at run-time those assertions whose status cannot be determined at compile-time. Both the static and the dynamic checking are provably safe in the sense that all errors flagged are definite violations of the specifications. Finally, we report briefly on the currently implemented instances of the generic framework.
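The run-time checking transformation can be illustrated with a minimal wrapper that injects pre- and post-condition tests around a call, so that every error flagged is a definite violation of the stated property. The predicates and the wrapped function are illustrative; this is not the framework's assertion language:

```python
def check(pre, post):
    """Program transformation sketch: wrap a function so that its
    precondition is tested on entry and its postcondition on exit."""
    def transform(f):
        def wrapped(*args):
            assert pre(*args), f"precondition violated: {args}"
            result = f(*args)
            assert post(result), f"postcondition violated: {result}"
            return result
        return wrapped
    return transform

# Hypothetical assertion: factorial requires n >= 0 and returns >= 1.
@check(pre=lambda n: n >= 0, post=lambda r: r >= 1)
def factorial(n):
    return 1 if n == 0 else n * factorial(n - 1)
```

Calling `factorial(-1)` is then flagged at run-time as a definite precondition violation, while calls that satisfy the specification pass through unchanged.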