571 results for Refine


Relevance:

10.00%

Publisher:

Abstract:

Context. In April 2004, the first image was obtained of a planetary-mass companion (now known as 2M 1207 b) in orbit around a self-luminous object different from our own Sun (the young brown dwarf 2MASSW J1207334-393254, hereafter 2M 1207 A). That 2M 1207 b probably formed via fragmentation and gravitational collapse offered proof that such a mechanism can form bodies in the planetary-mass regime. However, the predicted mass, luminosity, and radius of 2M 1207 b depend on its age, distance, and other observables, such as effective temperature. Aims. To refine our knowledge of the physical properties of 2M 1207 b and its nature, we accurately determined the distance to the 2M 1207 A and b system by measuring its trigonometric parallax at the milliarcsecond level. Methods. With the ESO NTT/SUSI2 telescope, we began a campaign of photometric and astrometric observations in 2006 to measure the trigonometric parallax of 2M 1207 A. Results. An accurate distance (52.4 ± 1.1 pc) to 2M 1207 A was measured. From the distance and proper motions we derived spatial velocities that are fully compatible with TWA membership. Conclusions. With this new distance estimate, we discuss three scenarios regarding the nature of 2M 1207 b: (1) a cool (1150 ± 150 K) companion of mass 4 ± 1 M-Jup; (2) a warmer (1600 ± 100 K) and heavier (8 ± 2 M-Jup) companion occulted by an edge-on circumsecondary disk; or (3) a hot protoplanet collision afterglow.
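
The distance result follows directly from the parallax-distance relation d[pc] = 1/π[arcsec]. A minimal sketch of that arithmetic, with the parallax value back-computed here from the quoted 52.4 ± 1.1 pc (it is not stated in this abstract):

```python
# Distance from trigonometric parallax: d [pc] = 1 / parallax [arcsec].
# Parallax and its uncertainty are assumed (back-computed), for illustration.

parallax_mas = 19.1          # parallax in milliarcseconds, ~1/52.4 pc
parallax_err_mas = 0.4       # assumed uncertainty for illustration

d_pc = 1000.0 / parallax_mas                         # 1000 mas = 1 arcsec
d_err_pc = d_pc * (parallax_err_mas / parallax_mas)  # first-order error propagation

print(f"d = {d_pc:.1f} +/- {d_err_pc:.1f} pc")       # ~52.4 +/- 1.1 pc
```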

Relevance:

10.00%

Publisher:

Abstract:

The use of finite element analysis (FEA) to design electrical motors has increased significantly in the past few years due to the ever-improving performance of modern computers. Even though analytical software remains the most widely used tool, FEA is commonly used to refine the analysis and produce the final design to be prototyped. The power factor, a standard item on motor manufacturers' data sheets, is important because it shows how much reactive power is consumed by the motor. This figure becomes important when the motor is connected to the network. However, calculating the power factor is not an easy task: owing to saturation phenomena, the motor input current has a high level of harmonics that cannot be neglected. In this work, FEA is used to evaluate a proposed (non-limiting) methodology for estimating the power factor or displacement factor of a small single-phase induction motor. Results of simulations and tests are compared.
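
To illustrate why harmonics make this calculation non-trivial, here is a minimal sketch contrasting the true power factor with the displacement factor for a synthetic distorted current; the waveform and harmonic amplitudes are invented, not taken from the paper's FEA results:

```python
import numpy as np

# One fundamental period of a sinusoidal supply and a distorted current.
f, fs, T = 60.0, 60.0 * 256, 1.0 / 60.0
t = np.arange(0, T, 1.0 / fs)

v = 311.0 * np.sin(2 * np.pi * f * t)                      # sinusoidal supply
i = (10.0 * np.sin(2 * np.pi * f * t - 0.5)                # fundamental, lagging
     + 2.0 * np.sin(2 * np.pi * 3 * f * t)                 # 3rd harmonic
     + 1.0 * np.sin(2 * np.pi * 5 * f * t))                # 5th harmonic

p_avg = np.mean(v * i)                                     # active power
s = np.sqrt(np.mean(v ** 2)) * np.sqrt(np.mean(i ** 2))    # apparent power
true_pf = p_avg / s                                        # includes distortion

# Displacement factor: cosine of the angle between the fundamentals only.
V1 = np.fft.rfft(v)[1]
I1 = np.fft.rfft(i)[1]
displacement_factor = np.cos(np.angle(V1) - np.angle(I1))

# The harmonics lower the true PF below the displacement factor.
print(f"true PF = {true_pf:.3f}, displacement factor = {displacement_factor:.3f}")
```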

Relevance:

10.00%

Publisher:

Abstract:

The cost of spatial join processing can be very high because of the large sizes of spatial objects and the computation-intensive spatial operations. While parallel processing seems a natural solution to this problem, it is not clear how spatial data should be partitioned for the purpose. This paper examines various spatial data partitioning methods and proposes a framework for parallel spatial join processing that combines the data-partitioning techniques used by most parallel join algorithms in relational databases with the filter-and-refine strategy for spatial operation processing. Object duplication caused by multi-assignment in spatial data partitioning can incur extra CPU cost as well as extra communication cost. We find that the key to overcoming this problem is to preserve spatial locality in task decomposition. We show that a near-optimal speedup can be achieved for parallel spatial join processing using our new algorithms.
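
A minimal sketch of the filter step under grid partitioning may help fix ideas. It assumes rectangular MBRs and uses the standard reference-point trick to report each candidate pair exactly once despite multi-assignment; the paper's own algorithms are not reproduced here:

```python
from collections import defaultdict

def mbr_intersects(a, b):
    # a, b: (xmin, ymin, xmax, ymax)
    return a[0] <= b[2] and b[0] <= a[2] and a[1] <= b[3] and b[1] <= a[3]

def cells_covering(mbr, cell):
    xmin, ymin, xmax, ymax = mbr
    for cx in range(int(xmin // cell), int(xmax // cell) + 1):
        for cy in range(int(ymin // cell), int(ymax // cell) + 1):
            yield (cx, cy)   # multi-assignment: object duplicated across cells

def partitioned_filter_join(R, S, cell=10.0):
    grid_r, grid_s = defaultdict(list), defaultdict(list)
    for r in R:
        for c in cells_covering(r, cell):
            grid_r[c].append(r)
    for s in S:
        for c in cells_covering(s, cell):
            grid_s[c].append(s)
    out = []
    for c in grid_r:                          # each cell is an independent task
        for r in grid_r[c]:
            for s in grid_s.get(c, []):
                if mbr_intersects(r, s):
                    # report only in the cell holding the intersection's
                    # lower-left corner, so duplicates are suppressed
                    rx, ry = max(r[0], s[0]), max(r[1], s[1])
                    if (int(rx // cell), int(ry // cell)) == c:
                        out.append((r, s))    # candidate pair for refine step
    return out

R = [(0, 0, 12, 12)]; S = [(8, 8, 25, 25)]
print(partitioned_filter_join(R, S))          # reported once despite duplication
```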

Relevance:

10.00%

Publisher:

Abstract:

In view of the relative risk of intracranial haemorrhage and major bleeding with thrombolytic therapy, it is important to identify as early as possible the low-risk patient who may not derive a net clinical benefit from thrombolysis in the setting of acute myocardial infarction. An analysis of 5434 hospital-treated patients with myocardial infarction in the Perth MONICA study showed that age below 60 and the absence of previous infarction or diabetes, shock, pulmonary oedema, cardiac arrest, and Q-wave or left bundle branch block on the initial ECG identified a large group of patients with a 28-day mortality of only 1% and a one-year mortality of only 2%. Identification of baseline risk in this way helps refine the risk-benefit equation for thrombolytic therapy and may help avoid unnecessary use of thrombolysis in those unlikely to benefit.
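
For illustration only, the stated low-risk profile reads as a simple conjunction of criteria. The field names below are assumptions, and this sketch is a paraphrase of the abstract, not a validated clinical tool:

```python
# Hypothetical encoding of the Perth MONICA low-risk profile (illustrative).
def low_risk_mi(patient):
    return (patient["age"] < 60
            and not patient["previous_infarction"]
            and not patient["diabetes"]
            and not patient["shock"]
            and not patient["pulmonary_oedema"]
            and not patient["cardiac_arrest"]
            and not patient["q_wave_or_lbbb_on_initial_ecg"])

example = {"age": 54, "previous_infarction": False, "diabetes": False,
           "shock": False, "pulmonary_oedema": False, "cardiac_arrest": False,
           "q_wave_or_lbbb_on_initial_ecg": False}
print(low_risk_mi(example))   # True -> 28-day mortality ~1% in the Perth cohort
```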

Relevance:

10.00%

Publisher:

Abstract:

Objectives: To identify the causes of death and main cardiovascular complications in adolescents and adults with congenitally malformed hearts. Design: Retrospective review of 102 necropsy reports from a tertiary centre obtained over a period of 19 years. Methods: The diagnosis, the operated or non-operated state of the main defect, the cause of death, and the main complications were related to age and gender. Other clinically relevant conditions, and identifiable sequels of previous diseases, were also noted. Results: Ages ranged from 15 to 69 years, with a mean of 31.1 and a median of 28 years, with no difference detected according to gender. Two-thirds of the patients had undergone at least one cardiac surgery. The mean age at death was significantly higher in non-operated patients (p = 0.003). The most prevalent cause of death in the whole group was related to recent surgery, found in one-third; of these, two-fifths corresponded to reoperations. Among the others, cardiac failure was the main terminal cause in another third, and the second cause was pulmonary thromboembolism in just over one-fifth, showing a significant association with histopathological signs of pulmonary hypertension (p = 0.011). Infection was the cause of death in 7.8% of the patients, all previously operated. Acute infective endocarditis was present, or was the indication for the recent surgery, in one-tenth of the patients, this cohort having a mean age of 27.8 years. There was a statistically significant association between the occurrence of endocarditis and defects causing low pulmonary blood flow (p = 0.043). Conclusions: Data derived from necropsies of adults with congenital heart defects can help the multidisciplinary team refine both their diagnosis and treatment.

Relevance:

10.00%

Publisher:

Abstract:

The authors present the first clinical implementation of an endoscopically assisted percutaneous anterolateral radiofrequency cordotomy. The aim of this article is to demonstrate the intradural endoscopic visualization of the cervical spinal cord via a percutaneous approach, refining the spinal target for anterolateral cordotomy while avoiding undesired trauma to the spinal tissue or injury to blood vessels. Initially, a lateral puncture of the spinal canal in the C1-2 interspace is performed, guided by fluoroscopy. As soon as CSF is reached by the guide cannula (17-gauge needle), the endoscope can be inserted for visualization of the spinal cord and its surrounding structures. The endoscopic visualization provided clear identification of the pial surface of the spinal cord, arachnoid membrane, dentate ligament, dorsal and ventral root entry zones, and blood vessels. The target for electrode insertion into the spinal cord was determined to be the midpoint between the dentate ligament and the ventral root entry zone. The endoscopic guidance shortened the fluoroscopy usage time, and no intrathecal contrast administration was needed. Cordotomy was performed by a standard radiofrequency method after refinement of the neurophysiological target. Satisfactory analgesia was provided by the procedure with no additional complications or CSF leak. The initial use of this technique suggests that a percutaneous endoscopic procedure may be useful for this particular manipulation of the spinal cord, possibly adding a degree of safety to the procedure and improving its effectiveness. (DOI: 10.3171/2010.4.JNS091779)

Relevance:

10.00%

Publisher:

Abstract:

Biochemical markers of remission of acromegaly activity are controversial. We studied a subset of treated acromegalic patients with discordant nadir GH levels after an oral glucose tolerance test (oGTT) and IGF-I values, in order to refine the current consensus on acromegaly remission. We also compared GH results from two GH immunoassays. From a cohort of 75 treated acromegalic patients, we studied 13 who presented an elevated IGF-I despite a post-oGTT nadir GH of ≤1 μg/l. The 12-h daytime GH profile (GH-12h), nadir GH after oGTT, and basal IGF-I levels were studied in patients and controls. The Bland-Altman method showed high concordance between the GH assays. Acromegalic patients showed higher mean GH-12h values (0.71±0.36 vs. 0.31±0.28 μg/l; p<0.05) and nadir GH after oGTT (0.48±0.32 vs. 0.097±0.002 μg/l; p<0.05) compared to controls. Nadir GH correlated with mean GH-12h (r=0.92, p<0.05). The mean GH-12h value at the upper 95% CI of controls (0.54 μg/l) would correspond to a theoretical normal nadir GH of ≤0.27 μg/l. Patients with a GH nadir ≤0.3 μg/l had IGF-I between 100% and 130% ULNR (percentage of the upper limit of the normal range) and a mean GH-12h of 0.35±0.15 μg/l, while patients with a GH nadir >0.3 and ≤1 μg/l had IGF-I >130% ULNR and a mean GH-12h of 0.93±0.24 μg/l. Our data integrate daytime GH secretion, nadir GH after oGTT, and plasma IGF-I concentrations, showing a continuum of mild residual activity in a subgroup of treated acromegalics with nadir GH values ≤1 μg/l. The degree of elevation of IGF-I levels and the nadir GH after oGTT correlate with subtle abnormalities of daytime GH secretion.

Relevance:

10.00%

Publisher:

Abstract:

To investigate the relationship between NF-κB activation and hepatic stellate cell (HSC) apoptosis in hepatosplenic schistosomiasis, hepatic biopsies from patients with Schistosoma mansoni-induced periportal fibrosis, hepatitis C virus-induced cirrhosis, and normal liver were submitted to α-smooth muscle actin (α-SMA) and NF-κB p65 immunohistochemistry, as well as to NF-κB Southwestern histochemistry and TUNEL assay. The numbers of α-SMA-positive cells and of NF-κB- and NF-κB p65-positive HSC nuclei were reduced in schistosomal fibrosis relative to liver cirrhosis. In addition, increased HSC NF-κB p65 and TUNEL labeling was observed in schistosomiasis compared to cirrhosis. These results suggest a possible relationship between the slight activation of the NF-κB complex and the increased number of apoptotic HSCs in schistosome-induced fibrosis, leading to a reduced HSC number in schistosomiasis relative to liver cirrhosis. The NF-κB pathway may therefore constitute an important down-regulatory mechanism in the pathogenesis of human schistosomiasis mansoni, although further studies are needed to refine the understanding of this process. (C) 2009 Elsevier B.V. All rights reserved.

Relevance:

10.00%

Publisher:

Abstract:

Concerns about reduced productivity and land degradation in the Mitchell grasslands of central western Queensland were addressed through a range monitoring program designed to interpret condition and trend. Botanical and edaphic parameters were recorded along piosphere and grazing gradients, and across fenceline impact areas, to maximise the changes attributable to grazing. The Degradation Gradient Method was used in conjunction with state-and-transition models to develop models of rangeland dynamics and condition. States were found to be ordered along a degradation gradient, indicator species were developed according to rainfall trends, and transitions were determined from field data and the available literature. Astrebla spp. abundance declined with declining range condition and increasing grazing pressure, while annual grasses and forbs increased in dominance under poor range condition. Soil erosion increased, and litter decreased, with decreasing range condition. An approach to quantitatively defining states within a variable rainfall environment, based upon a time-series ordination analysis, is described. The derived model could provide the interpretive framework necessary to integrate on-ground monitoring, remote sensing, and geographic information systems to trace states and transitions at the paddock scale. However, further work is needed to determine the full catalogue of states and transitions and to refine the model for application at the paddock scale.
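
As a rough illustration of a time-series ordination, the sketch below stacks yearly sites-by-species abundance matrices and projects them onto principal components, so each site traces a trajectory whose persistent shifts could be read as state transitions. The data are invented, and the Degradation Gradient Method itself is not reproduced:

```python
import numpy as np

rng = np.random.default_rng(1)
n_sites, n_species, n_years = 8, 12, 5
# Invented abundance data: one row per (site, year), one column per species.
X = rng.poisson(5.0, (n_sites * n_years, n_species)).astype(float)

Xc = X - X.mean(axis=0)                      # centre species abundances
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:2].T                       # first two ordination axes

# Trajectory of site 0 across years: a persistent shift along axis 1 could be
# read as a state transition rather than rainfall-driven fluctuation.
print(scores.reshape(n_sites, n_years, 2)[0])
```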

Relevance:

10.00%

Publisher:

Abstract:

Brazilian administrative reforms have always been an attempt to improve public management in the country, but it is also true that they were often used as an electoral platform or as rhetoric in drafts of government programs. The first of these reforms can be considered the one carried out in the 1930s by the government of Getúlio Vargas, which greatly reduced patrimonialist practices in the management of the State. Later, without much success, came the reform attempt introduced through Decree-Law 200 of 1967, during the military governments. With redemocratization, a reform was attempted under the Sarney government, whose positive outcomes included the creation of the National School of Public Administration (Escola Nacional de Administração Pública) and of the career of Specialist in Public Policy and Government Management (Especialista em Políticas Públicas e Gestão Governamental, EPPGG). The Collor government attempted a new administrative reform, which did not succeed. Under the Fernando Henrique Cardoso government, Minister Bresser Pereira was responsible, from 1995 onwards, for implementing a major new administrative reform in Brazil. One of the aims of this reform was to strengthen the Strategic Core of the State; to accomplish this, Bresser Pereira chose to strengthen the careers of the so-called State Management Cycle, and in this process the reform carried out a major restructuring of the EPPGG career. This restructuring brought into the Strategic Core of the State a corps of civil servants who were well trained and attuned to the management proposals envisaged in the reform. These professionals played a prominent role in the changes to public administration during Fernando Henrique's two terms, taking part in several projects and occupying various management positions at all levels of the public administration. Their successful work continued through President Lula's two subsequent terms. This research drew on several studies, books, articles, and interviews, which led to the conclusion that the restructuring of the EPPGG career was an effective instrument for the 1995 administrative reform to succeed in its goal of strengthening the Strategic Core of the State.

Relevance:

10.00%

Publisher:

Abstract:

One of the current frontiers in the clinical management of pectus excavatum (PE) patients is the prediction of the surgical outcome prior to the intervention. This can be done through computerized simulation of the Nuss procedure, which requires an anatomically correct representation of the costal cartilage. To this end, we take advantage of the costal cartilage's tubular structure to detect it through multi-scale vesselness filtering. This information is then used in an interactive 2D initialization procedure, which uses anatomical maximum intensity projections of the 3D vesselness feature images to efficiently initialize the 3D segmentation process. We identify the cartilage tissue centerlines in these projected 2D images using a livewire approach, and finally refine the 3D cartilage surface through region-based sparse-field level sets. We tested the proposed algorithm on 6 non-contrast CT datasets from PE patients. Good segmentation performance was found against reference manual contouring, with an average Dice coefficient of 0.75 ± 0.04 and an average mean surface distance of 1.69 ± 0.30 mm. The proposed method requires roughly 1 minute for the interactive initialization step, which can positively contribute to extended use of this tool in clinical practice, since current manual delineation of the costal cartilage can take up to an hour.
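
The two evaluation metrics quoted above are easy to state in code. A minimal sketch for binary 3D masks, using a brute-force surface distance rather than the distance transforms a real pipeline would use (e.g. scipy.ndimage.distance_transform_edt):

```python
import numpy as np

def dice(a, b):
    # Dice coefficient: 2|A∩B| / (|A| + |B|)
    a, b = a.astype(bool), b.astype(bool)
    return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

def surface_voxels(mask):
    # A voxel is on the surface if any 6-neighbour is background.
    m = mask.astype(bool)
    eroded = m.copy()
    for axis in range(m.ndim):
        for shift in (1, -1):
            eroded &= np.roll(m, shift, axis=axis)
    return np.argwhere(m & ~eroded)

def mean_surface_distance(a, b):
    # Symmetric mean of nearest-surface distances, in voxel units.
    sa, sb = surface_voxels(a), surface_voxels(b)
    d_ab = np.array([np.min(np.linalg.norm(sb - p, axis=1)) for p in sa])
    d_ba = np.array([np.min(np.linalg.norm(sa - p, axis=1)) for p in sb])
    return (d_ab.mean() + d_ba.mean()) / 2.0

a = np.zeros((20, 20, 20)); a[5:15, 5:15, 5:15] = 1   # toy "reference" mask
b = np.zeros((20, 20, 20)); b[6:16, 5:15, 5:15] = 1   # toy "segmentation", shifted
print(dice(a, b), mean_surface_distance(a, b))
```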

Relevance:

10.00%

Publisher:

Abstract:

The work presented here aims to describe the creation of a model to support a decision-support system on the risk inherent in executing projects in the Information Technology (IT) field, using data mining techniques. During a project's life cycle, countless factors contribute to its success or failure, and the responsibility for monitoring, anticipating, and mitigating those factors falls on the Project Manager. Project management is a difficult and expensive task: it consumes many resources, depends on numerous variables, and often even on the Project Manager's own experience. When confronted with duration and effort estimates for the execution of a given task, the Project Manager has, beyond personal perception and intuition, no objective way of measuring the plausibility of the values presented by the task's prospective executor. These estimates are fundamental to the organization, since decisions about corporate strategic planning, execution, postponement, cancellation, award, scope renegotiation, and outsourcing, among others, are taken on their basis. This propensity for deviation, when detected at an early stage, can help in better managing the risk associated with project management. The success of each finished project was rated by weighting three factors: deviation from budget, deviation from schedule, and deviation from specification. By analysing past projects and correlating some of their attributes with their degree of success, the model qualitatively classifies a new project according to its risk; in this context, risk represents the project's degree of departure from success. Using data mining algorithms such as classification trees and neural networks, we describe the development of a model that supports a decision-support system based on the classification of new projects. The models are the result of an extensive set of validation tests, in which the indicators that best characterize a project's attributes and most influence its risk are sought and refined. Weka 3 was used as the technological platform for development and testing. Proper use of the proposed model will enable the creation of more detailed contingency plans and closer management of projects that show a greater propensity for risk. The final result is thus intended to be one more tool at the Project Manager's disposal.
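
A minimal sketch of the kind of classifier described: historical projects, labelled by a weighted combination of their budget, schedule, and specification deviations, train a tree that grades a new project's risk. Features, data, and labels are invented, and scikit-learn stands in for the Weka 3 toolchain actually used:

```python
from sklearn.tree import DecisionTreeClassifier

# Invented features: [estimated effort (person-days), team size, duration (months)]
X = [[120, 4, 6], [300, 10, 12], [80, 3, 4], [500, 15, 18],
     [200, 6, 9], [90, 2, 5], [400, 12, 14], [150, 5, 7]]
# Labels derived (hypothetically) from the weighted budget/schedule/specification
# deviations of each finished project -- the thesis's three success factors.
y = ["low", "high", "low", "high", "medium", "low", "high", "medium"]

model = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
print(model.predict([[250, 8, 10]]))   # qualitative risk grade for a new project
```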

Relevance:

10.00%

Publisher:

Abstract:

Competitive electricity markets are complex environments involving a large number of different entities, playing in a dynamic scene to obtain the best advantages and profits. MASCEM is an electricity market simulator able to model market players and simulate their operation in the market. As market players are complex entities, each having its own characteristics and objectives, making its own decisions, and interacting with other players, a multi-agent architecture is used and has proved to be adequate. MASCEM players have learning capabilities and different risk preferences: they are able to refine their strategies according to their past experience (both real and simulated) and in light of other agents' behavior, and each agent's behavior is also subject to its own risk preferences.
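
A toy sketch of a learning, risk-aware market player may make the idea concrete. MASCEM's actual agent architecture and strategies are far richer, and everything below (class, parameters, update rule) is an assumption made for illustration:

```python
import random

class Player:
    def __init__(self, cost, risk_aversion):
        self.cost = cost                    # marginal cost of the offer
        self.margin = 0.2                   # initial mark-up over cost
        self.risk_aversion = risk_aversion  # 0 = aggressive, 1 = cautious

    def bid(self):
        return self.cost * (1.0 + self.margin)

    def learn(self, accepted):
        # Refine the strategy from experience: raise the margin after wins,
        # cut it after losses; cautious agents make smaller upward moves
        # and larger downward ones.
        step = 0.05
        if accepted:
            self.margin += step * (1.0 - self.risk_aversion)
        else:
            self.margin = max(0.0, self.margin - step * (1.0 + self.risk_aversion))

p = Player(cost=30.0, risk_aversion=0.7)
for day in range(10):
    clearing_price = random.uniform(30.0, 40.0)   # stand-in for the market
    p.learn(accepted=p.bid() <= clearing_price)
print(round(p.bid(), 2))
```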

Relevance:

10.00%

Publisher:

Abstract:

In distributed video coding, motion estimation is typically performed at the decoder to generate the side information, increasing the decoder complexity while providing low-complexity encoding in comparison with predictive video coding. Motion estimation can be performed once to create the side information, or several times to refine the side information quality along the decoding process. In this paper, motion estimation is performed at the decoder side to generate multiple side information hypotheses, which are adaptively and dynamically combined whenever additional decoded information becomes available. The proposed iterative side information creation algorithm is inspired by video denoising filters and requires some statistics of the virtual channel between each side information hypothesis and the original data. With the proposed denoising algorithm for side information creation, an RD performance gain of up to 1.2 dB is obtained for the same bitrate.
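
One simple way to combine multiple side information hypotheses using virtual-channel statistics is inverse-variance weighting, as in multi-hypothesis denoising. The sketch below assumes the per-hypothesis noise levels are known, which the actual algorithm would have to estimate, and is not the paper's adaptive scheme:

```python
import numpy as np

rng = np.random.default_rng(2)
original = rng.integers(0, 256, 64).astype(float)    # stand-in pixel block

# Each hypothesis = original + virtual-channel noise of a different strength.
sigmas = np.array([4.0, 9.0, 15.0])
hyps = [original + rng.normal(0, s, original.size) for s in sigmas]

w = 1.0 / sigmas ** 2                                # inverse-variance weights
w /= w.sum()
fused = sum(wi * h for wi, h in zip(w, hyps))

for name, est in [("best single", hyps[0]), ("fused", fused)]:
    mse = np.mean((est - original) ** 2)
    print(name, f"PSNR = {10 * np.log10(255 ** 2 / mse):.2f} dB")
```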

Relevance:

10.00%

Publisher:

Abstract:

Background: Paranoid ideation has been regarded as a cognitive and social process used as a defence against perceived threats. From this perspective, paranoid ideation can be understood as a process extending across the normal-pathological continuum. Methods: In order to refine the construct of paranoid ideation and to validate a measure of paranoia, 906 Portuguese participants from the general population and 91 patients were administered the General Paranoia Scale (GPS), and two conceptual models (one- and three-dimensional) were compared through confirmatory factor analysis (CFA). Results: The CFA of the GPS supported a different model from the one-dimensional model proposed by Fenigstein and Vanable, one comprising three dimensions (mistrust thoughts, persecutory ideas, and self-deprecation). This alternative model presented a better fit and increased sensitivity compared with the one-dimensional model. Further analysis revealed that the GPS is an adequate assessment tool for adults, with good psychometric characteristics and high internal consistency. Conclusion: The model proposed in the current work refines and enriches the construct of paranoia across different populations, allowing the assessment of three dimensions of paranoia and of the risk of clinical paranoia in a single measure for the general population.
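
As an aside on the internal-consistency claim, Cronbach's alpha is straightforward to compute from item scores. The sketch below uses simulated data; the 20-item count follows Fenigstein and Vanable's original Paranoia Scale and is assumed here only for shape:

```python
import numpy as np

def cronbach_alpha(items):
    # items: (n_respondents, n_items); alpha = k/(k-1) * (1 - sum(var_i)/var_total)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_vars / total_var)

rng = np.random.default_rng(3)
trait = rng.normal(0, 1, (500, 1))                # latent paranoia score
items = trait + rng.normal(0, 0.8, (500, 20))     # 20 correlated simulated items
print(f"alpha = {cronbach_alpha(items):.2f}")     # high for these simulated data
```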