883 results for non separable data
Resumo:
OBJECTIVE: The aim of the study was to identify the variables that predict the revolving door phenomenon in a psychiatric hospital at the moment of a second admission. METHODS: The sample consisted of 3,093 patients who were followed for 5 to 24 years after their first hospital admission due to schizophrenia or affective or psychotic disorders. Those who had four or more admissions during the study period were considered revolving door patients. Logistic regression analyses were used to assess the impact of gender, age, marital status, urban conditions, diagnosis, mean length of stay on the first admission, and the interval between the first and second admissions on the patterns of hospitalization. RESULTS: The variables with the highest predictive power for readmission were the interval between the first and second admissions and the length of stay in the first admission. CONCLUSIONS: These data may help public health planners provide optimal care to a small group of patients with more effective utilization of the available services.
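The logistic-regression modelling described above can be sketched as follows. This is a minimal stand-alone illustration, not the study's actual analysis: the two predictors (scaled length of first stay, scaled inter-admission interval) and all data values below are invented.

```python
import math

def train_logistic(X, y, lr=0.1, epochs=2000):
    """Plain stochastic-gradient-descent logistic regression (no regularization)."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = sum(wj * xj for wj, xj in zip(w, xi)) + b
            p = 1.0 / (1.0 + math.exp(-z))   # predicted readmission probability
            err = p - yi
            w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
            b -= lr * err
    return w, b

def predict(w, b, xi):
    """Probability that a patient is a revolving-door case."""
    z = sum(wj * xj for wj, xj in zip(w, xi)) + b
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical predictors: [scaled length of first stay,
# scaled interval between first and second admissions];
# label 1 = revolving-door patient (four or more admissions).
X = [[0.9, 0.1], [0.8, 0.2], [0.2, 0.9], [0.1, 0.8], [0.7, 0.3], [0.3, 0.7]]
y = [1, 1, 0, 0, 1, 0]
w, b = train_logistic(X, y)
```

The fitted coefficients indicate each predictor's direction of effect, which is how the abstract's "highest predictive power" comparison would be read off a real model.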
Resumo:
The introduction of Electric Vehicles (EVs), together with the implementation of smart grids, will raise new challenges for power system operators. This paper proposes a demand response program for electric vehicle users which provides the network operator with another useful resource, consisting of reducing vehicles' charging needs. This demand response program enables vehicle users to obtain a profit by agreeing to reduce their travel needs and minimum battery level requirements in a given period. To support network operator actions, the amount of demand response usage can be estimated using data mining techniques applied to a database containing a large set of operation scenarios. The paper includes a case study based on simulated operation scenarios that consider different operation conditions, e.g. available renewable generation, and a diversity of distributed resources and electric vehicles with vehicle-to-grid and demand response capacity in a 33-bus distribution network.
Resumo:
In this work, the identification and diagnosis of various stages of chronic liver disease are addressed. The classification results of a support vector machine, a decision tree and a k-nearest neighbor classifier are compared. Ultrasound image intensity and textural features are used jointly with clinical and laboratory data in the staging process. The classifiers are trained on a population of 97 patients at six different stages of chronic liver disease using a leave-one-out cross-validation strategy. The best results are obtained using the support vector machine with a radial-basis kernel, with an overall accuracy of 73.20%. The good performance of the method is a promising indicator that it can be used, in a non-invasive way, to provide reliable information about chronic liver disease staging.
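The leave-one-out evaluation protocol can be illustrated with a minimal sketch. This uses the k-nearest-neighbor classifier (one of the three compared, not the best-performing SVM), and the toy feature vectors and labels are invented, not the 97-patient dataset.

```python
def euclid(a, b):
    """Euclidean distance between two feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def knn_predict(train_X, train_y, x, k=1):
    """Majority vote among the k nearest training samples."""
    nearest = sorted(range(len(train_X)), key=lambda i: euclid(train_X[i], x))[:k]
    votes = [train_y[i] for i in nearest]
    return max(set(votes), key=votes.count)

def leave_one_out_accuracy(X, y, k=1):
    """Train on all samples but one, test on the held-out sample, repeat."""
    correct = 0
    for i in range(len(X)):
        tr_X = X[:i] + X[i + 1:]
        tr_y = y[:i] + y[i + 1:]
        if knn_predict(tr_X, tr_y, X[i], k) == y[i]:
            correct += 1
    return correct / len(X)

# Invented 2-D features for two hypothetical disease stages.
X = [[0.0, 0.0], [0.1, 0.0], [0.0, 0.1], [5.0, 5.0], [5.1, 5.0], [5.0, 5.1]]
y = [0, 0, 0, 1, 1, 1]
acc = leave_one_out_accuracy(X, y, k=1)
```

Leave-one-out is attractive for small cohorts like this one because every patient is used for testing exactly once while nearly all data remains available for training.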
Resumo:
OBJECTIVE: To estimate the spatial intensity of urban violence events using wavelet-based methods and emergency room data. METHODS: Information on victims attended at the emergency room of a public hospital in the city of São Paulo, Southeastern Brazil, from January 1, 2002 to January 11, 2003 was obtained from hospital records. The spatial distribution of 3,540 events was recorded, and a uniform random procedure was used to allocate records with incomplete addresses. Point processes and a wavelet analysis technique were used to estimate the spatial intensity, defined as the expected number of events per unit area. RESULTS: Of all georeferenced points, 59% were accidents and 40% were assaults. There is a non-homogeneous spatial distribution of the events, with high concentrations in two districts and along three large avenues in the southern area of the city of São Paulo. CONCLUSIONS: Hospital records combined with methodological tools to estimate the intensity of events are useful for studying urban violence. The wavelet analysis is useful in computing the expected number of events and their respective confidence bands for any sub-region and, consequently, in specifying risk estimates that could be used in decision-making for public policies.
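A full wavelet-based estimator is beyond a short sketch, but the quantity it targets, the expected number of events per unit area, can be illustrated with a simple binned stand-in. The coordinates below are invented; a wavelet estimator would additionally smooth across scales and supply confidence bands.

```python
from collections import Counter

def grid_intensity(points, cell=1.0):
    """Crude spatial intensity: count events in square cells of side `cell`
    and divide by the cell area, giving events per unit area per cell."""
    counts = Counter((int(x // cell), int(y // cell)) for x, y in points)
    area = cell * cell
    return {cell_id: n / area for cell_id, n in counts.items()}

# Invented georeferenced event coordinates (e.g. km in a local grid).
events = [(0.2, 0.3), (0.4, 0.1), (0.7, 0.8), (2.1, 2.2)]
intensity = grid_intensity(events, cell=1.0)
```

Cells with high intensity values are the analogue of the abstract's high-concentration districts; integrating the intensity over any sub-region gives that region's expected event count.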
Resumo:
This article is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License. The Attribution-NonCommercial (CC BY-NC) license lets others remix, tweak, and build upon the work non-commercially, provided that the new works acknowledge the original and are themselves non-commercial.
Resumo:
A detailed analytic and numerical study of baryogenesis through leptogenesis is performed in the framework of the standard model of electroweak interactions extended by the addition of three right-handed neutrinos, leading to the seesaw mechanism. We analyze the connection between GUT-motivated relations for the quark and lepton mass matrices and the possibility of obtaining a viable leptogenesis scenario. In particular, we analyze whether the constraints imposed by SO(10) GUTs can be compatible with all the available solar, atmospheric and reactor neutrino data and, simultaneously, be capable of producing the required baryon asymmetry via the leptogenesis mechanism. It is found that the Just-So(2) and SMA solar solutions lead to viable leptogenesis even for the simplest SO(10) GUT, while the LMA, LOW and VO solar solutions would require a different hierarchy for the Dirac neutrino masses in order to generate the observed baryon asymmetry. Some implications for CP violation at low energies and for neutrinoless double beta decay are also considered.
Resumo:
Ancillary services represent a good business opportunity that must be considered by market players. This paper presents a new methodology for ancillary services market dispatch. The method considers the bids submitted to the market and includes a market clearing mechanism based on deterministic optimization. An Artificial Neural Network is used for day-ahead prediction of Regulation Down, Regulation Up, Spin Reserve and Non-Spin Reserve requirements. Two test cases based on California Independent System Operator data concerning the dispatch of Regulation Down, Regulation Up, Spin Reserve and Non-Spin Reserve services are included in this paper to illustrate the application of the proposed method: (1) dispatch considering simple bids; (2) dispatch considering complex bids.
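The abstract describes the clearing mechanism only at a high level. A minimal merit-order sketch for the simple-bid case could look like the following; the prices and capacities are hypothetical, not CAISO data, and a real deterministic optimization would handle complex bids and coupled services.

```python
def clear_ancillary_market(bids, requirement):
    """Merit-order clearing for one ancillary service with simple bids.

    bids: list of (price in $/MW, offered capacity in MW).
    requirement: MW of the service needed (e.g. Spin Reserve).
    Returns the accepted (price, MW) schedule and the marginal clearing price.
    """
    accepted, remaining = [], requirement
    for price, cap in sorted(bids):          # cheapest offers first
        if remaining <= 0:
            break
        take = min(cap, remaining)           # accept only what is still needed
        accepted.append((price, take))
        remaining -= take
    if remaining > 0:
        raise ValueError("insufficient offers to meet the requirement")
    clearing_price = accepted[-1][0]         # price of the marginal accepted bid
    return accepted, clearing_price

# Hypothetical offers for a 60 MW Spin Reserve requirement.
accepted, price = clear_ancillary_market([(10, 50), (5, 30), (20, 100)], 60)
```

In the paper's setting, the ANN's day-ahead forecast would supply `requirement` for each of the four services before the clearing step runs.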
Resumo:
This study aimed to carry out experimental work to determine, for Newtonian and non-Newtonian fluids, the friction factor (fc) with simultaneous heat transfer, at constant wall temperature as the boundary condition, in fully developed laminar flow inside a vertical helical coil. The Newtonian fluids studied were aqueous solutions of glycerol at 25%, 36%, 43%, 59% and 78% (w/w). The non-Newtonian fluids were aqueous solutions of carboxymethylcellulose (CMC), a polymer, at concentrations of 0.2%, 0.3%, 0.4% and 0.6% (w/w), and aqueous solutions of xanthan gum (XG), another polymer, at concentrations of 0.1% and 0.2% (w/w). According to the rheological study performed, the polymer solutions had shear-thinning behavior and different degrees of viscoelasticity. The helical coil used has an internal diameter of 0.00483 m, a curvature ratio of 0.0263, a length of 5.0 m and a pitch of 11.34 mm. It was concluded that the friction factors with simultaneous heat transfer for Newtonian fluids can be calculated using expressions from the literature for isothermal flows. The friction factors for CMC and XG solutions are similar to those for Newtonian fluids when the Dean number, based on a generalized Reynolds number, is less than 80. For Dean numbers higher than 80, the friction factors of the CMC solutions are lower than those of the XG solutions and of the Newtonian fluids. In this range the friction factors decrease with an increasing viscometric component of the solution and increase with an increasing elastic component. The change of behavior at Dean number 80, for Newtonian and non-Newtonian fluids, is in accordance with the study of Ali [4] and with previous studies. The data also showed that using the bulk temperature or the film temperature to calculate the physical properties of the fluid has only a residual effect on the friction factor values.
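The Dean number based on a generalized Reynolds number can be computed as sketched below, assuming the Metzner-Reed form for power-law fluids (the abstract does not state which generalization was used). Only the internal diameter (0.00483 m) and curvature ratio (0.0263) come from the abstract; the fluid properties are illustrative.

```python
def generalized_reynolds(rho, u, d, K, n):
    """Metzner-Reed generalized Reynolds number for a power-law fluid.

    rho: density (kg/m^3), u: mean velocity (m/s), d: pipe diameter (m),
    K: consistency index (Pa.s^n), n: flow behavior index
    (n = 1 and K = mu recovers the Newtonian Reynolds number).
    """
    return rho * u ** (2 - n) * d ** n / (8 ** (n - 1) * K * ((3 * n + 1) / (4 * n)) ** n)

def dean_number(re, curvature_ratio):
    """Dean number: Reynolds number scaled by sqrt(d/D), the curvature ratio."""
    return re * curvature_ratio ** 0.5

# Illustrative shear-thinning solution (hypothetical K and n) in the coil used.
re_g = generalized_reynolds(rho=1000.0, u=0.5, d=0.00483, K=0.05, n=0.7)
de = dean_number(re_g, 0.0263)
```

The De = 80 threshold reported above would then be checked against `de` to decide whether the Newtonian-like friction-factor correlations still apply.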
Resumo:
Estuaries are perhaps the most threatened environments in the coastal fringe; the coincidence of high natural value and attractiveness for human use has led to conflicts between conservation and development. These conflicts occur in the Sado Estuary, since it is located near the industrialised zone of the Setúbal Peninsula while, at the same time, a great part of the estuary is classified as a Natural Reserve due to its high biodiversity. These facts led to the need to implement a model of environmental management and quality assessment, based on methodologies that enable the assessment of the Sado Estuary's quality and the evaluation of human pressures on the estuary. These methodologies are based on indicators that can best depict the state of the environment, not necessarily on everything that could be measured or analysed. Sediments have always been considered an important temporary source of some compounds, a sink for other types of materials, and an interface where a great diversity of biogeochemical transformations occurs. For all these reasons, they are of great importance in the formulation of a coastal management system. Many authors have used sediments to monitor aquatic contamination, showing great advantages compared to traditional water column sampling. The main objective of this thesis was to develop an estuary environmental management framework applied to the Sado Estuary using the DPSIR model (EMMSado), including data collection, data processing and data analysis. The support infrastructure of EMMSado was a set of spatially contiguous and homogeneous regions of sediment structure (management units). The environmental quality of the estuary was assessed through sediment quality assessment and integrated, at a preliminary stage, with the human pressure for development.
Besides the advantages explained earlier, studying the quality of the estuary mainly through indicators and indices of the sediment compartment also makes this methodology easier, faster, and less demanding of human and financial resources. These are essential factors for the efficient environmental management of coastal areas. Data management, visualization, processing and analysis were achieved through the combined use of indicators and indices, sampling optimization techniques, Geographical Information Systems, remote sensing, statistics for spatial data, Global Positioning Systems and best expert judgment. As a global conclusion, of the nineteen management units delineated and analyzed, three showed no ecological risk (18.5% of the study area). The areas of most concern (5.6% of the study area) are located in the North Channel and are under strong human pressure, mainly due to industrial activities. These areas also have low hydrodynamics and are thus associated with high levels of deposition. In particular, the areas near the Lisnave and Eurominas industries can also accumulate contamination coming from the Águas de Moura Channel, since particles coming from that channel can settle in this area due to the residual flow. In these areas the contaminants of concern, among those analyzed, are heavy metals and metalloids (Cd, Cu, Zn and As exceeded the PEL guidelines) and the pesticides BHC isomers, heptachlor, isodrin, DDT and its metabolites, endosulfan and endrin. In the remaining management units (76% of the study area) there is a moderate potential for the occurrence of adverse ecological effects, and in some of these areas no stress agents could be identified. This emphasizes the need for further research, since unmeasured chemicals may be causing or contributing to these adverse effects. Special attention must be paid to the units with a moderate potential for adverse ecological effects that are located inside the natural reserve.
Non-point source pollution from agriculture and aquaculture activities also seems to contribute an important pollution load to the estuary, entering through the Águas de Moura Channel. This pressure is expressed as a moderate potential for ecological risk in the areas near the entrance of this channel. Pressures may also come from the Alcácer Channel, although they were not quantified in this study. The management framework presented here, including all the methodological tools, may be applied and tested in other estuarine ecosystems, allowing comparisons between estuarine ecosystems in other parts of the globe.
Resumo:
Master's degree in Accounting and Management of Financial Institutions (Mestrado em Contabilidade e Gestão das Instituições Financeiras)
Resumo:
OBJECTIVE To estimate rates of non-adherence to telemedicine strategies aimed at treating drug addiction. METHODS A systematic review was conducted of randomized controlled trials investigating different telemedicine treatment methods for drug addiction. The following databases were consulted between May 18, 2012 and June 21, 2012: PubMed, PsycINFO, SciELO, Wiley (The Cochrane Library), Embase, Clinical Trials and Google Scholar. The Grading of Recommendations Assessment, Development and Evaluation (GRADE) approach was used to evaluate the quality of the studies. The criteria evaluated were: appropriate sequence of data generation, allocation concealment, blinding, description of losses and exclusions, and intention-to-treat analysis. There were 274 studies selected, of which 20 were analyzed. RESULTS Non-adherence rates varied between 15.0% and 70.0%. The interventions evaluated lasted at least three months and, although they all used telemedicine as support, treatment methods differed. Study quality also varied, from very poor to high. High-quality studies showed better adherence rates, as did those using more than one intervention technique and a limited treatment time. Mono-user studies showed better adherence rates than poly-user studies. CONCLUSIONS Rates of non-adherence to treatment involving telemedicine on the part of users of psychoactive substances differed considerably, depending on the country, the intervention method, follow-up time and substances used. Using more than one intervention technique, a short treatment duration and the type of substance used by patients appear to facilitate adherence.
Resumo:
A detailed analysis of fabrics of the chilled margin of a thick dolerite dyke (Foum Zguid dyke, Southern Morocco) was performed in order to better understand the development of sub-fabrics during dyke emplacement and cooling. AMS data were complemented with measurements of paramagnetic and ferrimagnetic fabrics (measured with a high-field torque magnetometer), neutron texture and microstructural analyses. The ferrimagnetic and AMS fabrics are similar, indicating that the ferrimagnetic minerals dominate the AMS signal. The paramagnetic fabric differs from the previous ones. Based on the crystallization timing of the different mineralogical phases, the paramagnetic fabric appears related to the upward flow, while the ferrimagnetic fabric rather reflects the late stage of dyke emplacement and cooling stresses.
Resumo:
We analyse the possibility that, in two Higgs doublet models, one or more of the Higgs couplings to fermions or to gauge bosons change sign relative to the respective Standard Model Higgs couplings. Possible sign changes in the coupling of a neutral scalar to charged ones are also discussed. These wrong signs can have important physical consequences, manifesting themselves in Higgs production via gluon fusion or in Higgs decay into two gluons or two photons. We consider all possible wrong sign scenarios, as well as the symmetric limit, in all possible Yukawa implementations of the two Higgs doublet model, for two cases: the observed Higgs boson is the lightest CP-even scalar, or the heaviest one. We also analyse thoroughly the impact of the currently available LHC data on such scenarios. With all 8 TeV data analysed, all wrong sign scenarios are allowed in all Yukawa types, even at the 1 sigma level. However, we show that B-physics constraints are crucial in excluding the possibility of wrong sign scenarios in the case where tan beta is below 1. We also discuss the future prospects for probing the wrong sign scenarios at the next LHC run. Finally, we present a scenario where the alignment limit could be excluded due to non-decoupling in the case where the heavy CP-even Higgs is the one discovered at the LHC.
Resumo:
In the field of appearance-based robot localization, the mainstream approach uses a quantized representation of local image features. An alternative strategy is the exploitation of raw feature descriptors, thus avoiding approximations due to quantization. In this work, the quantized and non-quantized representations are compared with respect to their discriminativity, in the context of the robot global localization problem. Having demonstrated the advantages of the non-quantized representation, the paper proposes mechanisms to reduce the computational burden this approach would carry, when applied in its simplest form. This reduction is achieved through a hierarchical strategy which gradually discards candidate locations and by exploring two simplifying assumptions about the training data. The potential of the non-quantized representation is exploited by resorting to the entropy-discriminativity relation. The idea behind this approach is that the non-quantized representation facilitates the assessment of the distinctiveness of features, through the entropy measure. Building on this finding, the robustness of the localization system is enhanced by modulating the importance of features according to the entropy measure. Experimental results support the effectiveness of this approach, as well as the validity of the proposed computation reduction methods.
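The entropy-based modulation of feature importance can be sketched as follows. This is a simplified reading of the idea with invented similarity scores, not the paper's actual formulation: a feature whose similarity mass concentrates on few candidate locations has low entropy and is treated as distinctive.

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def feature_weight(match_scores):
    """Weight a feature by how peaked its similarity distribution is over
    candidate locations: normalize the scores to a distribution, then map
    entropy to [0, 1], where 1 = fully distinctive and 0 = uninformative."""
    total = sum(match_scores)
    probs = [s / total for s in match_scores]
    h = entropy(probs)
    max_h = math.log2(len(match_scores))     # entropy of the uniform case
    return 1.0 - h / max_h

# Invented similarity of one raw (non-quantized) descriptor to 4 locations:
peaked = feature_weight([0.9, 0.03, 0.04, 0.03])   # distinctive feature
flat = feature_weight([0.25, 0.25, 0.25, 0.25])    # ambiguous feature
```

A localization score would then sum per-feature votes scaled by these weights, so ambiguous features contribute little, which is the robustness enhancement the abstract describes.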
Resumo:
Dissertation presented as a partial requirement for the degree of Master in Geographic Information Science and Systems (Mestre em Ciência e Sistemas de Informação Geográfica)