901 results for Figure and figurative
Abstract:
OBJECTIVE: To study the neurocognitive profile and its relationship to prefrontal dysfunction in non-demented Parkinson's disease (PD) with deficient haptic perception. METHODS: Twelve right-handed patients with PD and 12 healthy control subjects underwent thorough neuropsychological testing including the Rey complex figure, Rey auditory verbal and figural learning tests, figural and verbal fluency, and the Stroop test. Test scores reflecting significant differences between patients and healthy subjects were correlated with the individual expression coefficients of one principal component, obtained in a principal component analysis of an oxygen-15-labeled water PET study exploring somatosensory discrimination that differentiated between the two groups and involved prefrontal cortices. RESULTS: We found significantly decreased total scores for the verbal learning trials and verbal delayed free recall in PD patients compared with normal volunteers. Further analysis of these parameters using Spearman's rank correlation showed a significant negative correlation of deficient verbal recall with expression coefficients of the principal component whose image showed a subcortical-cortical network, including right dorsolateral prefrontal cortex, in PD patients. CONCLUSION: PD patients with disrupted right dorsolateral prefrontal cortex function and associated diminished somatosensory discrimination are also impaired in verbal memory functions. A negative correlation between delayed verbal free recall and PET activation in a network including the prefrontal cortices suggests that verbal cues, and accordingly declarative memory processes, may be operative in PD during activities that demand sustained attention, such as somatosensory discrimination. Verbal cues may be compensatory in nature and help to enhance focused attention non-specifically in the presence of a functionally disrupted prefrontal cortex.
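As a rough illustration of the correlation step described in this abstract, the minimal sketch below computes Spearman's rank correlation between delayed-recall scores and per-subject expression coefficients of a principal component; all variable names and values are hypothetical and not taken from the study.

```python
import numpy as np
from scipy.stats import spearmanr

# Hypothetical per-patient values (n = 12); not the study's actual data.
delayed_recall = np.array([5, 7, 4, 6, 3, 8, 5, 4, 6, 2, 7, 5])
pc_expression = np.array([0.9, 0.4, 1.1, 0.6, 1.3, 0.2, 0.8, 1.0, 0.5, 1.4, 0.3, 0.7])

# Spearman's rank correlation between recall performance and PC expression coefficients;
# a negative rho would mirror the direction of the reported association.
rho, p_value = spearmanr(delayed_recall, pc_expression)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")
```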
Abstract:
RATIONALE AND OBJECTIVES: To evaluate the effect of automatic tube current modulation on radiation dose and image quality for low tube voltage computed tomography (CT) angiography. MATERIALS AND METHODS: An anthropomorphic phantom was scanned with a 64-section CT scanner using the following tube voltages: 140 kVp (Protocol A), 120 kVp (Protocol B), 100 kVp (Protocol C), and 80 kVp (Protocol D). To achieve similar noise, combined z-axis and xy-axes automatic tube current modulation was applied. Effective dose (ED) for the four tube voltages was assessed. Three plastic vials filled with different concentrations of iodinated solution were placed on the phantom's abdomen to obtain attenuation measurements. The signal-to-noise ratio (SNR) was calculated, and a figure of merit (FOM) for each iodinated solution was computed as SNR²/ED. RESULTS: The ED was kept similar for the four different tube voltages: (A) 5.4 ± 0.3 mSv, (B) 4.1 ± 0.6 mSv, (C) 3.9 ± 0.5 mSv, and (D) 4.2 ± 0.3 mSv (P > .05). As the tube voltage decreased from 140 to 80 kVp, image noise was maintained (range, 13.8-14.9 HU) (P > .05). SNR increased as the tube voltage decreased, with an overall gain of 119% for the 80-kVp compared to the 140-kVp protocol (P < .05). The FOM results indicated that with a reduction of the tube voltage from 140 to 120, 100, and 80 kVp, at constant SNR, ED was reduced by a factor of 2.1, 3.3, and 5.1, respectively (P < .001). CONCLUSIONS: As tube voltage decreases, automatic tube current modulation for CT angiography yields either a significant increase in image quality at constant radiation dose or a significant decrease in radiation dose at a constant image quality.
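A minimal sketch of the figure-of-merit calculation reported above (FOM = SNR²/ED); the signal, noise, and dose numbers below are illustrative placeholders, not the study's measurements.

```python
# Figure of merit for dose efficiency: FOM = SNR^2 / ED.
def figure_of_merit(signal_hu: float, noise_hu: float, effective_dose_msv: float) -> float:
    """Return FOM = SNR^2 / ED for one iodinated-solution measurement."""
    snr = signal_hu / noise_hu
    return snr ** 2 / effective_dose_msv

# Example: comparing two tube-voltage protocols at similar image noise (hypothetical values).
fom_140 = figure_of_merit(signal_hu=200.0, noise_hu=14.5, effective_dose_msv=5.4)
fom_80 = figure_of_merit(signal_hu=440.0, noise_hu=14.5, effective_dose_msv=4.2)
print(f"Dose-efficiency gain (80 vs 140 kVp): {fom_80 / fom_140:.1f}x")
```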
Abstract:
In this paper I first discuss some non-causal change constructions which have largely gone unnoticed in the literature, such as The butler bowed the guests in (which is said to code mild causation) and The supporters booed Newcastle off at the interval (which only codes temporal coextension between its two constitutive subevents). Since the same structure (i.e. the transitive object-oriented change construction) can be used to code a wide spectrum of causal and temporal relations, the question arises of what cognitive mechanisms may be involved in such meaning shifts. I argue that variation can be motivated on the basis of the figure/ground segregation which the conceptualiser can impose upon the integrated scene coded by the change construction. The integrated scene depicts a force-dynamic scenario but also evokes a unique temporal setting (i.e. temporal overlap or coextension between the constitutive subevents). Such a “bias” towards temporal overlap can be used by the conceptualiser to background causation and highlight temporal overlap interpretations. It is also shown that figure/ground segregation can be invoked to account for the causal interpretation of intransitive change constructions, e.g. The kettle boiled dry. If the conceptual distance between the verbal event and the non-verbal event is (relatively) great, causality can be highlighted even in intransitive patterns.
Abstract:
Telescopic systems of structural members with clearance are found in many applications, e.g., mobile cranes, rack feeders, forklifts, and stacker cranes (see Figure 1). When these machines are operated, undesirable vibrations may reduce performance and create safety problems. This contribution therefore aims to reduce these harmful vibrations. For a better understanding, the dynamic behaviour of these constructions is analysed. The main interest is the overlap region between each pair of sections of the systems described above (see markings in Figure 1), which is investigated by measurements and by computations. A test rig is constructed to determine the dynamic behaviour by measuring fundamental vibrations and higher-frequency oscillations, damping coefficients, and other characteristic effects. For an appropriate physical model, the governing boundary value problem is derived by applying Hamilton’s principle, and a classical discretisation procedure is used to generate a coupled system of nonlinear ordinary differential equations as the corresponding truncated mathematical model. On the basis of this model, a controller concept for preventing harmful vibrations is developed.
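As a generic, hedged sketch of the modelling step described above (not the paper's actual formulation; symbols are illustrative), Hamilton's principle combined with a Ritz/Galerkin-type discretisation leads to a truncated system of nonlinear ordinary differential equations:

```latex
% Hamilton's principle for the elastic structure (T: kinetic energy, V: potential energy,
% W_nc: virtual work of non-conservative forces); all symbols illustrative.
\delta \int_{t_1}^{t_2} \bigl( T - V \bigr)\,\mathrm{d}t
  + \int_{t_1}^{t_2} \delta W_{\mathrm{nc}}\,\mathrm{d}t = 0

% Ritz/Galerkin ansatz: expand the deflection in shape functions \phi_i(x)
% with time-dependent coordinates q_i(t), truncated after n terms.
w(x,t) \approx \sum_{i=1}^{n} \phi_i(x)\, q_i(t)

% Inserting the ansatz yields a coupled system of nonlinear ODEs in q(t):
\mathbf{M}\ddot{\mathbf{q}} + \mathbf{D}\dot{\mathbf{q}} + \mathbf{K}\mathbf{q}
  + \mathbf{f}_{\mathrm{nl}}(\mathbf{q},\dot{\mathbf{q}}) = \mathbf{g}(t)
```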
Abstract:
Perceptual fluency is the subjective experience of ease with which an incoming stimulus is processed. Although perceptual fluency is assessed by speed of processing, it remains unclear how objective speed is related to subjective experiences of fluency. We present evidence that speed at different stages of the perceptual process contributes to perceptual fluency. In an experiment, figure-ground contrast influenced detection of briefly presented words, but not their identification at longer exposure durations. Conversely, the font in which a word was written influenced identification, but not detection. Both contrast and font influenced subjective fluency. These findings suggest that processing speeds at different stages are condensed into a unified subjective experience of perceptual fluency.
Abstract:
Futures did reduce price risk. Hedging produced a higher minimum return and a higher return at the 25th percentile (75% of the returns are better than this figure) than did the cash market. The 50th percentile, or median, return was higher for yearlings in the cash market than for hedged cattle, and the calves had mixed results. Although the differences are not great, there have been months when the option strategies performed better than cash or futures (i.e., January–April and September–October), and there are months when they did not fare well (i.e., June–August).
Abstract:
Self-assembly is a powerful tool for the construction of highly organized nanostructures. Therefore, the possibility to control and predict pathways of molecular ordering on the nanoscale is a critical issue for the production of materials with tunable and adaptive macroscopic properties. 2D polymers are attractive objects for materials science due to their exceptional properties. [1] As shown before, amphiphilic oligopyrenotides (produced via automated solid-phase synthesis) form rod-like supramolecular polymers in water. [2] These assemblies form 1D objects. [3] By applying certain changes to the design of the oligopyrenotide units, the dimensionality of the formed assemblies can be influenced. Herein, we demonstrate that Py3 (see Figure 1) forms defined supramolecular assemblies under thermodynamic conditions in water. To study Py3 self-assembly, we carried out a whole set of spectroscopic (UV/vis, fluorescence, DLS) and microscopic (AFM) experiments. The obtained results suggest that oligopyrenotides with the present type of geometry and linker length lead to the formation of 2D supramolecular assemblies.
Abstract:
Several authors have demonstrated an increased number of mitotic figures in breast cancer resection specimens when compared with biopsy material. This has been ascribed to a sampling artifact, where biopsies are (i) either too small to allow formal mitotic figure counting or (ii) not necessarily taken from the proliferating tumor periphery. Herein, we propose a different explanation for this phenomenon. Biopsy and resection material of 52 invasive ductal carcinomas was studied. We counted mitotic figures in 10 representative high power fields and quantified MIB-1 immunohistochemistry by visual estimation, counting and image analysis. We found that mitotic figures were elevated by more than three-fold on average in resection specimens over biopsy material from the same tumors (20±6 vs 6±2 mitoses per 10 high power fields, P=0.008), and that this resulted in a relative diminution of post-metaphase figures (anaphase/telophase), which made up 7% of all mitotic figures in biopsies but only 3% in resection specimens (P<0.005). At the same time, the percentages of MIB-1 immunostained tumor cells among total tumor cells were comparable in biopsy and resection material, irrespective of the mode of MIB-1 quantification. Finally, we found no association between the size of the biopsy material and the relative increase of mitotic figures in resection specimens. We propose that the increase in mitotic figures in resection specimens and the significant shift towards metaphase figures are not due to a sampling artifact, but reflect ongoing cell cycle activity in the resected tumor tissue due to fixation delay. The dwindling energy supply will eventually arrest tumor cells in metaphase, where they are readily identified by the diagnostic pathologist. Taken together, we suggest that the rapidly fixed biopsy material better represents true tumor biology and should be privileged as a predictive marker of putative response to cytotoxic chemotherapy.
Abstract:
In the United States, endometrial cancer is the leading cancer of the female reproductive tract. There are 40,100 new cases and 7,470 deaths from endometrial cancer estimated for 2008 (47). The average five-year survival rate for endometrial cancer is 84%; however, this figure is substantially lower in patients diagnosed with late-stage, advanced disease and much higher for patients diagnosed with early-stage disease (47). Endometrial cancer (EC) has been associated with several risk factors, including obesity, diabetes, hypertension, previously documented occurrence of hereditary non-polyposis colorectal cancer (HNPCC), and heightened exposure to estrogen (25). As of yet, there has not been a dependable molecular predictor of endometrial cancer occurrence in women with these predisposing factors. The goal of our lab is to identify genes that are aberrantly expressed in EC and may serve as molecular biomarkers of EC progression. One candidate protein that we are exploring as a biomarker of EC progression is the cell survival protein survivin.
Abstract:
Telecommunications have developed at an incredible speed over the last couple of decades. The decreasing size of our phones and the increasing number of ways in which we can communicate are hardly the only results of this (r)evolutionary development, which has multiple further implications. The change of paradigm for telecommunications regulation, epitomised by the processes of liberalisation and reregulation, was not sufficient to answer all regulatory questions pertinent to communications. Today, after the transition from monopoly to competition, we are faced with perhaps an even harder regulatory puzzle, since we must figure out how to regulate a sector that is as dynamic and as unpredictable as electronic communications have proven to be, and as vital and fundamental to the economy and to society at large. The present book addresses the regulatory puzzle of contemporary electronic communications and suggests the outlines of a coherent model for their regulation. The search for such a model essentially involves deliberations on the question "Can competition law do it all?", since generic competition rules are largely seen as the appropriate regulatory tool for the communications domain. The latter perception has been the gist of the 2002 reform of the European Community (EC) telecommunications regime, which envisages a withdrawal of sectoral regulation as communications markets become effectively competitive and ultimately bestows the regulation of the sector upon competition law alone. The book argues that the question of whether competition law is the appropriate tool needs to be examined not in the conventional contexts of sector-specific rules versus competition rules or deregulation versus regulation, but in a broader governance context. Consequently, the reader is provided with an insight into the workings and specific characteristics of the communications sector as network-bound, converging, dynamic and endowed with a special societal role and function. A thorough evaluation of the regulatory objectives in the communications environment contributes further to a comprehensive picture of the communications industry. Upon this carefully prepared basis, the book analyses the communications regulatory toolkit. It explores the interplay between sectoral communications regulation, competition rules (in particular Article 82 of the EC Treaty) and the rules of the World Trade Organization (WTO) relevant to telecommunications services. The in-depth analysis of the multilevel construct of EC communications law is up-to-date and takes into account important recent developments in EC competition law practice, in particular in the fields of refusal to supply and tying, the reform of the EC electronic communications framework, and new decisions of the WTO dispute settlement body, notably the Mexico-Telecommunications Services Panel Report. Upon these building elements, an assessment of the regulatory potential of the EC competition rules is made. The conclusions drawn reach beyond the current situation of EC electronic communications and the applicable law, and explore the possible contours of an optimal regulatory framework for modern communications. The book is of particular interest to communications and antitrust law experts, as well as policy makers, government agencies, consultancies and think-tanks active in the field.
Experts on other network industries (such as electricity or postal communications) can also profit from the substantial experience gathered in the communications sector as the most advanced one in terms of liberalisation and reregulation.
Abstract:
The nineteenth century uncovered and analysed the tragic episodes of witch-hunting and ‘witch’ trials common in Renaissance Europe. Fascinating not only to historians, this subject also inspired men of letters, who popularized the image of the witch as an old, ugly and evil person who thus deserved her lot. Jules Michelet’s La sorcière of 1862 takes a very different approach. Simultaneously a literary and a historical work, the book proved scandalous as it rehabilitated the figure of the witch, shedding favourable light on her image: it was the witch who was able to save a last spark of humanity in moments of despair; it was she who acted as comforter and healer to the people. In the context of nineteenth-century literature, certain works by female authors that focused on ‘witches’ stand out. Whilst certain male authors (Michelet included) presented the witch as a figure from the past who had finally perished in the seventeenth century, texts such as George Sand’s La petite Fadette (1848) or Eliza Orzeszkowa’s Dziurdziowie (1885) suggest that the end of witch trials did not imply an end to accusations, persecutions, and even executions of ‘witches’ – and that, in terms of culture, witchcraft or sorcery had not disappeared from the societies they knew.
Abstract:
Barry Saltzman was a giant in the fields of meteorology and climate science. A leading figure in the study of weather and climate for over 40 years, he has frequently been referred to as the "father of modern climate theory." Ahead of his time in many ways, Saltzman made significant contributions to our understanding of the general circulation and spectral energetics budget of the atmosphere, as well as climate change across a wide spectrum of time scales. In his endeavor to develop a unified theory of how the climate system works, he played a role in the development of energy balance models, statistical dynamical models, and paleoclimate dynamical models. He was a pioneer in developing meteorologically motivated dynamical systems, including the progenitor of Lorenz's famous chaos model. In applying his own dynamical-systems approach to long-term climate change, he recognized the potential for using atmospheric general circulation models in a complementary way. In 1998, he was awarded the Carl-Gustaf Rossby Medal, the highest honor of the American Meteorological Society, "for his life-long contributions to the study of the global circulation and the evolution of the earth's climate." In this paper, the authors summarize and place into perspective some of the most significant contributions that Barry Saltzman made during his long and distinguished career. This short review also serves as an introduction to the papers in this special issue of the Journal of Climate dedicated to Barry's memory.
Abstract:
INTRODUCTION Proteinuria (PTU) is an important marker for the development and progression of renal disease, cardiovascular disease and death, but there is limited information about the prevalence of, and factors associated with, confirmed PTU in predominantly white European HIV+ persons, especially in those with an estimated glomerular filtration rate (eGFR) >60 mL/min/1.73 m². PATIENTS AND METHODS Baseline was defined as the first of two consecutive dipstick urine protein (DPU) measurements during prospective follow-up after 1/6/2011 (when systematic data collection began). PTU was defined as two consecutive DPU readings >1+ (>30 mg/dL) >3 months apart; persons with eGFR <60 at either DPU measurement were excluded. Logistic regression investigated factors associated with PTU. RESULTS A total of 1,640 persons were included; participants were mainly white (n=1,517, 92.5%), male (n=1,296, 79.0%) and men who have sex with men (n=809; 49.3%). Median age at baseline was 45 years (IQR 37-52), and median CD4 count was 570/mm³ (IQR 406-760). The median baseline date was 2/12 (IQR 11/11-6/12), and median eGFR was 99 mL/min/1.73 m² (IQR 88-109). Sixty-nine persons had PTU (4.2%, 95% CI 3.2-4.7%). Persons with diabetes had increased odds of PTU, as did those with a prior non-AIDS or AIDS event and those with prior exposure to indinavir. Among females, those with a normal eGFR (>90) and those with prior abacavir use had lower odds of PTU (Figure 1). CONCLUSIONS One in 25 persons with eGFR >60 had confirmed proteinuria at baseline. Factors associated with PTU were similar to those associated with CKD. The lack of association with antiretrovirals, particularly tenofovir, may be due to the cross-sectional design of this study, and additional follow-up is required to address progression to PTU in those without PTU at baseline. It may also suggest that other markers are needed to capture the deteriorating renal function associated with antiretrovirals at higher eGFRs. Our findings suggest PTU is an early marker for impaired renal function.
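An illustrative sketch of the logistic-regression step described above; the covariates, simulated data, and effect sizes are hypothetical and not the cohort's actual variables or results.

```python
import numpy as np
import statsmodels.api as sm

# Simulate a toy cohort (purely illustrative; not study data).
rng = np.random.default_rng(0)
n = 500
diabetes = rng.binomial(1, 0.10, n)
prior_aids = rng.binomial(1, 0.20, n)
indinavir = rng.binomial(1, 0.15, n)

# Outcome (confirmed proteinuria) simulated with modest positive effects for each factor.
logit_p = -3.0 + 0.8 * diabetes + 0.6 * prior_aids + 0.7 * indinavir
ptu = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

# Fit the logistic regression and report odds ratios (intercept first).
X = sm.add_constant(np.column_stack([diabetes, prior_aids, indinavir]))
result = sm.Logit(ptu, X).fit(disp=False)
print(np.exp(result.params))
```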
Abstract:
INTRODUCTION Rates of both TB/HIV co-infection and multidrug-resistant (MDR) TB are increasing in Eastern Europe (EE). Data on the clinical management of TB/HIV co-infected patients are scarce. Our aim was to study the clinical characteristics of TB/HIV patients in Europe and Latin America (LA) at TB diagnosis, identify factors associated with MDR-TB, and assess the activity of initial TB treatment regimens given the results of drug-susceptibility tests (DST). MATERIAL AND METHODS We enrolled 1,413 TB/HIV patients from 62 clinics in 19 countries in EE, Western Europe (WE), Southern Europe (SE) and LA from January 2011 to December 2013. Among patients who completed DST within the first month of TB therapy, we linked initial TB treatment regimens to the DST results and calculated the distribution of patients receiving 0, 1, 2, 3 and ≥4 active drugs in each region. Risk factors for MDR-TB were identified in logistic regression models. RESULTS Significant differences were observed between EE (n=844), WE (n=152), SE (n=164) and LA (n=253) in the use of combination antiretroviral therapy (cART) at TB diagnosis (17%, 40%, 44% and 35%, p<0.0001), a definite TB diagnosis (culture- and/or PCR-positive for Mycobacterium tuberculosis; 47%, 71%, 72% and 40%, p<0.0001) and MDR-TB prevalence (34%, 3%, 3% and 11%, p<0.0001 among those with DST results). A history of injecting drug use [adjusted OR (aOR) = 2.03, 95% CI 1.00-4.09], prior TB treatment (aOR = 3.42, 95% CI 1.88-6.22) and living in EE (aOR = 7.19, 95% CI 3.28-15.78) were associated with MDR-TB. For the 569 patients with available DST results, the initial TB treatment contained ≥3 active drugs in 64% of patients in EE compared with 90-94% of patients in the other regions (Figure 1a). Had the patients received standard initial therapy [rifampicin, isoniazid, pyrazinamide, ethambutol (RHZE)], the corresponding proportions would have been 64% vs. 86-97%, respectively (Figure 1b). CONCLUSIONS In EE, TB/HIV patients less often received cART, less often had a definitive TB diagnosis and more often had MDR-TB compared with other parts of Europe and LA. Initial TB therapy in EE was sub-optimal, with less than two-thirds of patients receiving at least three active drugs, and improved compliance with standard RHZE treatment does not seem to be the solution. Improved management of TB/HIV patients requires routine use of DST, initial TB therapy according to prevailing resistance patterns, and more widespread use of cART.
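An illustrative sketch of linking an initial TB regimen to DST results and counting active drugs, as described above; the drug names and the toy resistance profile are examples only, not study data.

```python
# Count drugs in a regimen to which the isolate is not resistant according to DST.
def count_active_drugs(regimen: set[str], dst_resistant: set[str]) -> int:
    """Number of drugs in the regimen that remain active given the DST resistance profile."""
    return len(regimen - dst_resistant)

standard_rhze = {"rifampicin", "isoniazid", "pyrazinamide", "ethambutol"}

# Example MDR isolate: resistant to at least rifampicin and isoniazid.
dst_resistant = {"rifampicin", "isoniazid"}

print(count_active_drugs(standard_rhze, dst_resistant))  # -> 2 active drugs
```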