922 results for Pattern of Use


Relevance: 100.00%

Publisher:

Abstract:

In this study, we assess the climate mitigation potential from afforestation in a mountainous snow-rich region (Switzerland) with strongly varying environmental conditions. Using radiative forcing calculations, we quantify both the carbon sequestration potential and the effect of albedo change at high resolution. We calculate the albedo radiative forcing based on remotely sensed data sets of albedo, global radiation and snow cover. Carbon sequestration is estimated from changes in carbon stocks based on national inventories. We first estimate the spatial pattern of radiative forcing (RF) across Switzerland assuming homogeneous transitions from open land to forest. This highlights where forest expansion still exhibits climatic benefits when including the radiative forcing of albedo change. Second, given that forest expansion is currently the dominant land-use change process in the Swiss Alps, we calculate the radiative forcing that occurred between 1985 and 1997. Our results show that the net RF of forest expansion ranges from −24 W m−2 at low elevations of the northern Prealps to 2 W m−2 at high elevations of the Central Alps. The albedo RF increases with increasing altitude, which offsets the CO2 RF at high elevations with long snow-covered periods, high global radiation and low carbon sequestration. Albedo RF is particularly relevant during transitions from open land to open forest but not in later stages of forest development. Between 1985 and 1997, when overall forest expansion in Switzerland was approximately 4%, the albedo RF offset the CO2 RF by an average of 40%. We conclude that the albedo RF should be considered at an appropriately high resolution when estimating the climatic effect of forestation in temperate mountainous regions.
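The net effect described in this abstract is essentially additive bookkeeping: a negative (cooling) CO2 radiative forcing from carbon sequestration is partly offset by a positive (warming) radiative forcing from the albedo decrease. A minimal sketch of that arithmetic in Python, with hypothetical per-pixel values; only the ~40% offset figure is taken from the abstract:

```python
# Illustrative bookkeeping of net radiative forcing (RF) from forest expansion.
# Values are hypothetical except the ~40% offset mentioned in the abstract;
# the study's actual per-pixel calculation is far more detailed.

def net_rf(co2_rf_wm2: float, albedo_rf_wm2: float) -> float:
    """Net RF = (negative) CO2 RF from carbon sequestration + (positive) albedo RF."""
    return co2_rf_wm2 + albedo_rf_wm2

co2_rf = -10.0     # W m-2, cooling from carbon uptake (hypothetical)
albedo_rf = 4.0    # W m-2, warming from darker forest over snow (hypothetical)

print(net_rf(co2_rf, albedo_rf))        # -6.0 W m-2: still a net cooling
print(abs(albedo_rf) / abs(co2_rf))     # 0.4 -> albedo RF offsets CO2 RF by 40%
```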

Relevance: 100.00%

Publisher:

Abstract:

PURPOSE Metastatic renal cell carcinoma can be clinically diverse in terms of the pattern of metastatic disease and response to treatment. We studied the impact of metastasis and location on cancer specific survival. MATERIALS AND METHODS The records of 2,017 patients with renal cell cancer and tumor thrombus who underwent radical nephrectomy and tumor thrombectomy from 1971 to 2012 at 22 centers in the United States and Europe were analyzed. Number and location of synchronous metastases were compared with respect to patient cancer specific survival. Multivariable Cox regression models were used to quantify the impact of covariates. RESULTS Lymph node metastasis (155) or distant metastasis (725) was present in 880 (44%) patients. Of the patients with distant disease 385 (53%) had an isolated metastasis. The 5-year cancer specific survival was 51.3% (95% CI 48.6-53.9) for the entire group. On univariable analysis patients with isolated lymph node metastasis had a significantly worse cancer specific survival than those with a solitary distant metastasis. The location of distant metastasis did not have any significant effect on cancer specific survival. On multivariable analysis the presence of lymph node metastasis, isolated distant metastasis and multiple distant metastases were independently associated with cancer specific survival. Moreover higher tumor thrombus level, papillary histology and the use of postoperative systemic therapy were independently associated with worse cancer specific survival. CONCLUSIONS In our multi-institutional series of patients with renal cell cancer who underwent radical nephrectomy and tumor thrombectomy, almost half of the patients had synchronous lymph node or distant organ metastasis. Survival was superior in patients with solitary distant metastasis compared to isolated lymph node disease.
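The multivariable analysis described above is a Cox proportional hazards model of cancer specific survival. A minimal sketch of the mechanics with the lifelines package follows; the study's data are not available here, so the package's bundled Rossi recidivism dataset stands in purely for illustration:

```python
# Minimal multivariable Cox proportional hazards sketch with lifelines.
# The study's data are not reproduced here; the bundled Rossi dataset only
# demonstrates the mechanics of fitting and reading a Cox model.
from lifelines import CoxPHFitter
from lifelines.datasets import load_rossi

df = load_rossi()                      # columns: week (time), arrest (event), covariates
cph = CoxPHFitter()
cph.fit(df, duration_col="week", event_col="arrest")
cph.print_summary()                    # hazard ratios, CIs and p-values per covariate

# In the study, covariates such as lymph node metastasis, number of distant
# metastases, thrombus level and histology would enter analogously, with
# cancer specific survival as the time-to-event outcome.
```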

Relevance: 100.00%

Publisher:

Abstract:

The aim of our study is to compare the prevalence of illicit drug use estimated through a technique referred to as the “crosswise model” (CM) with the results from conventional direct questioning (DQ). Method: About 1,500 students from Tehran University of Medical Sciences were first interviewed in 2009–2010 by DQ and then, three months later, by the CM. Result: The CM yielded significantly higher estimates than DQ for lifetime prevalence of use of any illicit drug (CM = 20.2%, DQ = 3.0%, p < .001) and for lifetime prevalence of use of opium or its residue (CM = 13.6%, DQ = 1.0%, p < .001). Also, for use of any illicit drug in the last month and use of opium or its residue in the last month, the CM yielded higher point estimates than DQ, although these differences were not significant (any drug: CM = 1.5%, DQ = 0.2%, p = .66; opium: CM = 3.8%, DQ = 0.0%, p = .21). Conclusion: Our findings suggest that the CM is a fruitful data collection method for sensitive topics such as substance abuse.
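For readers unfamiliar with the crosswise model: each respondent reports only whether their answer to the sensitive item matches their answer to a non-sensitive item with known prevalence p, so the observed share of "same" answers lambda satisfies lambda = pi*p + (1 - pi)*(1 - p), which can be inverted for the sensitive prevalence pi. A minimal sketch of this standard estimator with purely illustrative numbers (not the study's data):

```python
# Standard crosswise-model point estimate and standard error.
# lam: observed share of "my answers are the same" responses
# p:   known prevalence of the non-sensitive item (must differ from 0.5)
# n:   sample size
from math import sqrt

def crosswise_estimate(lam: float, p: float, n: int):
    """Return (prevalence estimate, standard error) under the crosswise model."""
    pi_hat = (lam + p - 1.0) / (2.0 * p - 1.0)
    se = sqrt(lam * (1.0 - lam) / n) / abs(2.0 * p - 1.0)
    return pi_hat, se

# Hypothetical example: 1,500 respondents, non-sensitive item prevalence 0.25,
# 65% of respondents report matching answers -> estimated prevalence ~0.20.
print(crosswise_estimate(lam=0.65, p=0.25, n=1500))
```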

Relevance: 100.00%

Publisher:

Abstract:

SUMMARY Campylobacteriosis has been the most common food-associated notifiable infectious disease in Switzerland since 1995. Contact with and ingestion of raw or undercooked broilers are considered the dominant risk factors for infection. In this study, we investigated the temporal relationship between the disease incidence in humans and the prevalence of Campylobacter in broilers in Switzerland from 2008 to 2012. We use a time-series approach to describe the pattern of the disease by incorporating seasonal effects and autocorrelation. The analysis shows that prevalence of Campylobacter in broilers, with a 2-week lag, has a significant impact on disease incidence in humans. Therefore Campylobacter cases in humans can be partly explained by contagion through broiler meat. We also found a strong autoregressive effect in human illness, and a significant increase of illness during Christmas and New Year's holidays. In a final analysis, we corrected for the sampling error of prevalence in broilers and the results gave similar conclusions.
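A simplified stand-in for the kind of lagged time-series regression described above: weekly human case counts regressed on broiler prevalence two weeks earlier, an autoregressive term, and a holiday indicator. This is ordinary least squares on synthetic data, not the study's actual model or data:

```python
# Simplified lagged time-series regression (synthetic data, illustration only).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
weeks = 260                                   # ~5 years of weekly data
prevalence = 30 + 10 * np.sin(2 * np.pi * np.arange(weeks) / 52) + rng.normal(0, 2, weeks)
holiday = ((np.arange(weeks) % 52) >= 51).astype(int)   # crude Christmas/New Year flag
cases = np.empty(weeks)
cases[:2] = 100
for t in range(2, weeks):                     # cases depend on lagged prevalence and own past
    cases[t] = 20 + 0.5 * cases[t - 1] + 1.5 * prevalence[t - 2] + 15 * holiday[t] + rng.normal(0, 5)

df = pd.DataFrame({"cases": cases, "prevalence": prevalence, "holiday": holiday})
df["prev_lag2"] = df["prevalence"].shift(2)   # broiler prevalence two weeks earlier
df["cases_lag1"] = df["cases"].shift(1)       # autoregressive term

model = smf.ols("cases ~ prev_lag2 + cases_lag1 + holiday", data=df.dropna()).fit()
print(model.summary())
```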

Relevance: 100.00%

Publisher:

Abstract:

BACKGROUND Contrast-enhanced diagnostic imaging techniques are considered useful in veterinary and human medicine to evaluate liver perfusion and focal hepatic lesions. Although hepatic diseases are a common occurrence in reptile medicine, there is no reference to the use of contrast-enhanced ultrasound (CEUS) and contrast-enhanced computed tomography (CECT) to evaluate the liver in lizards. Therefore, the aim of this study was to evaluate the pattern of change in echogenicity and attenuation of the liver in green iguanas (Iguana iguana) after administration of specific contrast media. RESULTS An increase in liver echogenicity and density was evident during CEUS and CECT, respectively. In CEUS, the mean ± SD (median; range) peak enhancement was 19.9% ± 7.5 (18.3; 11.7-34.6). Time to peak enhancement was 134.0 ± 125.1 (68.4; 59.6-364.5) seconds. During CECT, first visualization of the contrast medium was at 3.6 ± 0.5 (4; 3-4) seconds in the aorta, 10.7 ± 2.2 (10.5; 7-14) seconds in the hepatic arteries, and 15 ± 4.5 (14.5; 10-24) seconds in the liver parenchyma. Time to peak was 14.1 ± 3.4 (13; 11-21) and 31 ± 9.6 (29; 23-45) seconds in the aorta and the liver parenchyma, respectively. CONCLUSION CEUS and dynamic CECT are practical means to determine liver hemodynamics in green iguanas. The distribution of contrast medium in iguanas differed from that in mammals. Specific reference ranges of hepatic perfusion for diagnostic evaluation of the liver in iguanas are necessary, since the use of mammalian references may lead the clinician to formulate incorrect diagnostic suspicions.

Relevance: 100.00%

Publisher:

Abstract:

Biologic agents (also termed biologicals or biologics) are therapeutics that are synthesized by living organisms and directed against a specific determinant, for example, a cytokine or receptor. In inflammatory and autoimmune diseases, biologicals have revolutionized the treatment of several immune-mediated disorders. Biologicals have also been tested in allergic disorders. These include agents targeting IgE; T helper 2 (Th2)-type and Th2-promoting cytokines, including interleukin-4 (IL-4), IL-5, IL-9, IL-13, IL-31, and thymic stromal lymphopoietin (TSLP); pro-inflammatory cytokines, such as IL-1β, IL-12, IL-17A, IL-17F, IL-23, and tumor necrosis factor (TNF); the chemokine receptor CCR4; and lymphocyte surface and adhesion molecules, including CD2, CD11a, CD20, CD25, CD52, and OX40 ligand. In this task force paper of the Interest Group on Biologicals of the European Academy of Allergy and Clinical Immunology, we review biologicals that are currently available or being tested for use in various allergic and urticarial pathologies, providing an overview of their state of development, area of use, adverse events, and future research directions.

Relevance: 100.00%

Publisher:

Abstract:

The deglaciation history of the Swiss Alps after the Last Glacial Maximum involved the decay of several ice domes and the subsequent disintegration of valley glaciers at high altitude. Here we use bedrock exposure dating to reconstruct the temporal and spatial pattern of ice retreat at the Simplon Pass (altitude: ∼2000 m) located 40 km southwest of the ‘Rhône ice dome’. Eleven 10Be exposure ages from glacially polished quartz veins and ice-molded bedrock surfaces cluster tightly between 13.5 ± 0.6 ka and 15.4 ± 0.6 ka (internal errors) indicating that the Simplon Pass depression became ice-free at 14.1 ± 0.4 ka (external error of mean age). This age constraint is interpreted to record the melting of the high valley glaciers in the Simplon Pass region during the warm Bølling–Allerød interstadial shortly after the Oldest Dryas stadial. Two bedrock samples collected a few hundred meters above the pass depression yield older 10Be ages of 17.8 ± 0.6 ka and 18.0 ± 0.6 ka. These ages likely reflect the initial downwasting of the Rhône ice dome and the termination of the ice transfluence from the ice dome across the Simplon Pass toward the southern foreland. There, the retreat of the piedmont glacier in Val d’Ossola was roughly synchronous with the decay of the Rhône ice dome in the interior of the mountain belt, as shown by 10Be ages of 17.7 ± 0.9 ka and 16.1 ± 0.6 ka for a whaleback at ∼500 m elevation near Montecrestese in northern Italy. In combination with well-dated paleoclimate records derived from lake sediments, our new age data suggest that during the deglaciation of the European Alps the decay of ice domes was approximately synchronous with the retreat of piedmont glaciers in the foreland and was followed by the melting of high-altitude valley glaciers after the transition from the Oldest Dryas to the Bølling–Allerød, when mean annual temperatures rose rapidly by ∼3 °C.
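A mean exposure age with an uncertainty such as the 14.1 ± 0.4 ka quoted above is typically an inverse-variance weighted mean of the individual ages. A minimal sketch of that calculation with hypothetical ages (the eleven measured values are not listed in the abstract):

```python
# Inverse-variance (error-weighted) mean of exposure ages; ages are hypothetical.
import numpy as np

ages_ka = np.array([13.5, 14.0, 14.3, 15.4, 13.9])    # 10Be exposure ages, ka
sigma_ka = np.array([0.6, 0.5, 0.6, 0.6, 0.5])        # 1-sigma internal errors, ka

weights = 1.0 / sigma_ka**2
mean_age = np.sum(weights * ages_ka) / np.sum(weights)
mean_err = 1.0 / np.sqrt(np.sum(weights))             # internal error of the weighted mean

print(f"{mean_age:.1f} +/- {mean_err:.1f} ka")
# The study additionally propagates systematic (production-rate) uncertainty
# into the quoted external error of the mean age.
```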

Relevance: 100.00%

Publisher:

Abstract:

OBJECTIVE The aim of this study was to evaluate whether the distribution pattern of early ischemic changes in the initial MRI allows a practical method for estimating leptomeningeal collateralization in acute ischemic stroke (AIS). METHODS Seventy-four patients with AIS underwent MRI followed by conventional angiography and mechanical thrombectomy. Diffusion restriction on diffusion-weighted imaging (DWI) and the correlated T2-hyperintensity of the infarct were retrospectively analyzed and subdivided in accordance with the Alberta Stroke Program Early CT Score (ASPECTS). Patients were graded angiographically into collateralization groups according to the method of Higashida and dichotomized into 2 groups: 29 subjects with collateralization grade 3 or 4 (well-collateralized group) and 45 subjects with grade 1 or 2 (poorly-collateralized group). Individual ASPECTS areas were compared among the groups. RESULTS Means for overall DWI-ASPECTS were 6.34 vs. 4.51 (well vs. poorly collateralized groups, respectively), and for T2-ASPECTS 9.34 vs. 8.96. A significant difference between groups was found for DWI-ASPECTS (p<0.001), but not for T2-ASPECTS (p = 0.088). Regarding the individual areas, only the insula, M1-M4 and M6 showed significantly fewer infarctions in the well-collateralized group (p-values <0.001 to 0.015). In these six areas, 89% of patients in the well-collateralized group showed 0-2 infarctions (44.8% with 0 infarctions), while 59.9% of patients in the poorly-collateralized group showed 3-6 infarctions. CONCLUSION Patients with poor leptomeningeal collateralization show more infarcts on the initial MRI, particularly in the ASPECTS areas M1 to M4, M6 and the insula. Therefore, DWI abnormalities in these areas may be a surrogate marker for poor leptomeningeal collaterals and may be useful for estimating the collateral status in routine clinical evaluation.
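The abstract does not state which statistical test produced the group comparison; a Mann-Whitney U test is a common choice for ordinal ASPECTS values. A minimal sketch on synthetic scores, purely to illustrate the comparison, not to reproduce the study:

```python
# Comparing DWI-ASPECTS between collateralization groups (synthetic scores, illustration only).
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
well_collateralized = rng.integers(4, 10, size=29)     # hypothetical DWI-ASPECTS, n = 29
poorly_collateralized = rng.integers(2, 8, size=45)    # hypothetical DWI-ASPECTS, n = 45

u, p = stats.mannwhitneyu(well_collateralized, poorly_collateralized, alternative="two-sided")
print(f"U = {u:.1f}, p = {p:.4f}")
```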

Relevance: 100.00%

Publisher:

Abstract:

Lipid resonances from mobile lipids can be observed by ¹H NMR spectroscopy in multiple tissues and have also been associated with malignancy. In order to use lipid resonances as a marker for disease, a reference standard from healthy tissue has to be established, taking the influence of variable factors such as the spinning rate into account. The purpose of our study was to investigate the effect of spinning rate variation on the HR-MAS pattern of lipid resonances in non-neoplastic brain biopsies from different regions and to visualize polar and non-polar lipids by fluorescence microscopy using Nile Red staining. ¹H HR-MAS NMR spectroscopy demonstrated higher lipid peak intensities in normal sheep brain pure white matter biopsies compared to mixed white and gray matter biopsies and pure gray matter biopsies. High spinning rates increased the visibility of the methyl resonances at 1.3 ppm and the methylene resonance at 0.89 ppm more strongly in white matter biopsies than in thalamus and brainstem biopsies and in gray matter biopsies. The absence of lipid droplets and the presence of a large number of myelin sheaths observed in white matter by Nile Red fluorescence microscopy suggest that the observed lipid resonances originate from the macromolecular pool of lipid protons of the myelin sheaths' plasma membranes. When using lipid content as a marker for disease, the variable behavior of lipid resonances in different neuroanatomical regions of the brain and at variable spinning rates should be considered. The findings may open up interesting possibilities for investigating lipids in myelin sheaths.

Relevance: 100.00%

Publisher:

Abstract:

Frontal alpha band asymmetry (FAA) is a marker of altered reward processing in major depressive disorder (MDD), associated with reduced approach behavior and withdrawal. However, its association with brain metabolism remains unclear. The aim of this study is to investigate FAA and its correlation with resting-state cerebral blood flow (rCBF). We hypothesized an association of FAA with regional rCBF in brain regions relevant for reward processing and motivated behavior, such as the striatum. We enrolled 20 patients and 19 healthy subjects. FAA scores and rCBF were quantified with the use of EEG and arterial spin labeling. Correlations between the two were evaluated, as well as the association of FAA with psychometric assessments of motivated behavior and anhedonia. Patients showed a left-lateralized pattern of frontal alpha activity and a correlation of FAA lateralization with subscores of the Hamilton Depression Rating Scale linked to motivated behavior. An association of rCBF and FAA scores was found in clusters in the dorsolateral prefrontal cortex bilaterally (patients) and in the left medial frontal gyrus, the right caudate head and the right inferior parietal lobule (whole group). No correlations were found in healthy controls. Higher inhibitory right-lateralized alpha power was associated with lower rCBF values in prefrontal and striatal regions, predominantly in the right hemisphere, which are involved in the processing of motivated behavior and reward. Inhibitory brain activity in the reward system may contribute to some of the motivational problems observed in MDD.
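FAA is commonly computed as the difference of log alpha power between homologous right and left frontal electrodes (e.g. F4 and F3). A minimal sketch of that computation on synthetic signals; the study's exact preprocessing and electrode choice may differ:

```python
# Common frontal alpha asymmetry (FAA) computation: ln(right alpha power) - ln(left alpha power).
# Synthetic signals stand in for F4/F3 EEG channels; illustration only.
import numpy as np
from scipy.signal import welch

fs = 250.0                                   # sampling rate (Hz), hypothetical
t = np.arange(0, 60, 1 / fs)                 # 60 s of data
rng = np.random.default_rng(0)
f3 = np.sin(2 * np.pi * 10 * t) * 1.2 + rng.normal(0, 1, t.size)   # left frontal channel
f4 = np.sin(2 * np.pi * 10 * t) * 0.8 + rng.normal(0, 1, t.size)   # right frontal channel

def alpha_power(x, fs, band=(8.0, 13.0)):
    """Mean power spectral density in the alpha band (Welch estimate)."""
    freqs, psd = welch(x, fs=fs, nperseg=int(4 * fs))
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[mask].mean()

faa = np.log(alpha_power(f4, fs)) - np.log(alpha_power(f3, fs))
print(f"FAA = {faa:.3f}")   # negative here: the synthetic right channel carries less alpha power
```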

Relevance: 100.00%

Publisher:

Abstract:

During intertemporal decisions, the preference for smaller, sooner rewards over larger, delayed rewards (temporal discounting, TD) exhibits substantial inter-subject variability; however, it is currently unclear which mechanisms underlie this apparently idiosyncratic behavior. To answer this question, here we recorded and analyzed mouse movement kinematics during intertemporal choices in a large sample of participants (N = 86). Results revealed a specific pattern of decision dynamics associated with the selection of “immediate” versus “delayed” response alternatives, which discriminated well between a “discounter” and a “farsighted” behavior, thus representing a reliable behavioral marker of TD preferences. By fitting the Drift Diffusion Model to the data, we showed that differences between discounter and farsighted subjects could be explained in terms of different model parameterizations, corresponding to the use of different choice mechanisms in the two groups. While farsighted subjects were biased toward the “delayed” option, discounter subjects were not correspondingly biased toward the “immediate” option. Rather, as shown by the dynamics of evidence accumulation over time, their behavior was characterized by high choice uncertainty.
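The Drift Diffusion Model mentioned above treats each choice as noisy evidence accumulation toward one of two bounds ("delayed" vs. "immediate"). A minimal Euler-Maruyama simulation showing how a starting-point bias toward the "delayed" bound, as described for farsighted subjects, shifts choice proportions; all parameter values are illustrative, not fitted:

```python
# Minimal drift-diffusion simulation (Euler-Maruyama); parameters are illustrative.
import numpy as np

def simulate_ddm(drift, bias, n_trials=1000, bound=1.0, noise=1.0, dt=0.005, rng=None):
    """Return the fraction of trials ending at the upper ('delayed') bound.

    bias is the starting point in (-bound, bound); 0 means no starting-point bias.
    """
    rng = rng or np.random.default_rng(0)
    upper_hits = 0
    for _ in range(n_trials):
        x = bias
        while abs(x) < bound:                     # accumulate noisy evidence until a bound is hit
            x += drift * dt + noise * np.sqrt(dt) * rng.standard_normal()
        upper_hits += x >= bound
    return upper_hits / n_trials

print("farsighted-like (start biased toward 'delayed'):", simulate_ddm(drift=0.2, bias=0.4))
print("discounter-like (no starting-point bias):       ", simulate_ddm(drift=-0.2, bias=0.0))
```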

Relevance: 100.00%

Publisher:

Abstract:

The ATLS program by the American College of Surgeons is probably the most important globally active training organization dedicated to improving trauma management. Detection of acute haemorrhagic shock is one of the key issues in clinical practice and thus also in medical teaching. In this issue of the journal, William Schulz and Ian McConachrie critically review the ATLS shock classification (Table 1), which has been criticized after several attempts at validation have failed [1]. The main problem is that distinct ranges of heart rate are related to ranges of uncompensated blood loss and that the heart rate decrease observed in severe haemorrhagic shock is ignored [2].

Table 1. Estimated blood loss based on patient's initial presentation (ATLS Student Course Manual, 9th Edition, American College of Surgeons 2012).

                            | Class I          | Class II       | Class III             | Class IV
Blood loss (ml)             | Up to 750        | 750–1500       | 1500–2000             | >2000
Blood loss (% blood volume) | Up to 15%        | 15–30%         | 30–40%                | >40%
Pulse rate (BPM)            | <100             | 100–120        | 120–140               | >140
Systolic blood pressure     | Normal           | Normal         | Decreased             | Decreased
Pulse pressure              | Normal or ↑      | Decreased      | Decreased             | Decreased
Respiratory rate            | 14–20            | 20–30          | 30–40                 | >35
Urine output (ml/h)         | >30              | 20–30          | 5–15                  | Negligible
CNS/mental status           | Slightly anxious | Mildly anxious | Anxious, confused     | Confused, lethargic
Initial fluid replacement   | Crystalloid      | Crystalloid    | Crystalloid and blood | Crystalloid and blood

In a retrospective evaluation of the Trauma Audit and Research Network (TARN) database, blood loss was estimated according to the injuries in nearly 165,000 adult trauma patients and each patient was allocated to one of the four ATLS shock classes [3]. Although heart rate increased and systolic blood pressure decreased from class I to class IV, respiratory rate and GCS were similar. The median heart rate in class IV patients was substantially lower than the value of 140 min⁻¹ postulated by ATLS. Moreover, deterioration of the different parameters does not necessarily proceed in parallel as the ATLS shock classification suggests [4] and [5]. In all these studies, injury severity score (ISS) and mortality increased with increasing shock class [3] and with increasing heart rate and decreasing blood pressure [4] and [5]. This supports the general concept that the higher the heart rate and the lower the blood pressure, the sicker the patient.

A prospective study attempted to validate a shock classification derived from the ATLS shock classes [6]. The authors used a combination of heart rate, blood pressure, clinically estimated blood loss and response to fluid resuscitation to classify trauma patients (Table 2) [6]. In their initial assessment of 715 predominantly blunt trauma patients, 78% were classified as normal (Class 0), 14% as Class I, 6% as Class II and only 1% each as Class III and Class IV. This corresponds to the results of the previous retrospective studies [4] and [5]. The main endpoint used in the prospective study was therefore the presence or absence of significant haemorrhage, defined as chest tube drainage >500 ml, evidence of >500 ml of blood loss in the peritoneum, retroperitoneum or pelvic cavity on CT scan, requirement of any blood transfusion, or >2000 ml of crystalloid. Because of the low prevalence of Class II or higher grades, statistical evaluation was limited to a comparison between Class 0 and Classes I–IV combined.
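To make the Table 1 thresholds concrete before turning to the prospective validation results, here is a minimal sketch that maps an estimated blood loss to the corresponding ATLS class; it encodes only the blood-loss row of the table and is an illustration, not a clinical tool:

```python
# ATLS class by estimated blood loss (Table 1, blood-loss row only); illustration, not a clinical tool.
def atls_class_by_blood_loss(ml: float) -> str:
    if ml <= 750:
        return "Class I"      # up to 750 ml (up to 15% blood volume)
    if ml <= 1500:
        return "Class II"     # 750-1500 ml (15-30%)
    if ml <= 2000:
        return "Class III"    # 1500-2000 ml (30-40%)
    return "Class IV"         # >2000 ml (>40%)

print(atls_class_by_blood_loss(1200))   # Class II
```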
As in the retrospective studies, Lawton did not find a statistical difference in heart rate and blood pressure among the five groups either, although there was a tendency toward a higher heart rate in Class II patients. Apparently, classification during the primary survey did not rely on vital signs but considered the rather soft criterion of "clinical estimation of blood loss" and the requirement of fluid substitution. This suggests that allocation of an individual patient to a shock class was probably more an intuitive decision than an objective calculation. Nevertheless it was a significant predictor of ISS [6].

Table 2. Shock grade categories in the prospective validation study (Lawton, 2014) [6].

                                  | Normal (no haemorrhage) | Class I (mild)                 | Class II (moderate)            | Class III (severe)              | Class IV (moribund)
Vitals                            | Normal                  | Normal                         | HR >100 with SBP >90 mmHg      | SBP <90 mmHg                    | SBP <90 mmHg or imminent arrest
Response to fluid bolus (1000 ml) | NA                      | Yes, no further fluid required | Yes, no further fluid required | Requires repeated fluid boluses | Declining SBP despite fluid boluses
Estimated blood loss (ml)         | None                    | Up to 750                      | 750–1500                       | 1500–2000                       | >2000

What does this mean for clinical practice and medical teaching? All these studies illustrate the difficulty of validating a useful and accepted general physiologic concept of the organism's response to fluid loss: a decrease in cardiac output, an increase in heart rate and a decrease in pulse pressure occur first, whereas hypotension and bradycardia occur only later. Increasing heart rate, increasing diastolic blood pressure or decreasing systolic blood pressure should make any clinician consider hypovolaemia first, because it is treatable and deterioration of the patient is preventable. This is true for the patient on the ward, the sedated patient in the intensive care unit or the anesthetized patient in the OR. We will therefore continue to teach this typical pattern, but will also continue to mention the exceptions and pitfalls in a second step. The shock classification of ATLS is primarily used to illustrate the typical pattern of acute haemorrhagic shock (tachycardia and hypotension) as opposed to the Cushing reflex (bradycardia and hypertension) in severe head injury and intracranial hypertension, or to the neurogenic shock in acute tetraplegia or high paraplegia (relative bradycardia and hypotension). Schulz and McConachrie nicely summarize the various confounders and exceptions from the general pattern and explain why, in clinical reality, patients often do not present with the "typical" pictures of our textbooks [1]. ATLS refers to the pitfalls in the signs of acute haemorrhage as well: advanced age, athletes, pregnancy, medications and pacemakers, and explicitly states that individual subjects may not follow the general pattern. Obviously the ATLS shock classification, which is the basis for a number of questions in the written test of the ATLS student course and which has been used for decades, probably needs modification and cannot be literally applied in clinical practice. The European Trauma Course, another important trauma training program, uses the same parameters to estimate blood loss together with the clinical exam and laboratory findings (e.g. base deficit and lactate), but does not use a shock classification related to absolute values. In conclusion, the typical physiologic response to haemorrhage as illustrated by the ATLS shock classes remains an important issue in clinical practice and in teaching.
The estimation of the severity of haemorrhage in the initial assessment of trauma patients is not (and never was) based solely on vital signs; it also includes the pattern of injuries, the requirement of fluid substitution and potential confounders. Vital signs are not obsolete, especially in the course of treatment, but must be interpreted in view of the clinical context. Conflict of interest: none declared. Member of the Swiss national ATLS core faculty.

Relevance: 100.00%

Publisher:

Abstract:

Importance In treatment-resistant schizophrenia, clozapine is considered the standard treatment. However, clozapine use has restrictions owing to its many adverse effects. Moreover, an increasing number of randomized clinical trials (RCTs) of other antipsychotics have been published. Objective To integrate all the randomized evidence from the available antipsychotics used for treatment-resistant schizophrenia by performing a network meta-analysis. Data Sources MEDLINE, EMBASE, Biosis, PsycINFO, PubMed, Cochrane Central Register of Controlled Trials, World Health Organization International Trial Registry, and clinicaltrials.gov were searched up to June 30, 2014. Study Selection At least 2 independent reviewers selected published and unpublished single- and double-blind RCTs in treatment-resistant schizophrenia (any study-defined criterion) that compared any antipsychotic (at any dose and in any form of administration) with another antipsychotic or placebo. Data Extraction and Synthesis At least 2 independent reviewers extracted all data into standard forms and assessed the quality of all included trials with the Cochrane Collaboration's risk-of-bias tool. Data were pooled using a random-effects model in a Bayesian setting. Main Outcomes and Measures The primary outcome was efficacy as measured by overall change in symptoms of schizophrenia. Secondary outcomes included change in positive and negative symptoms of schizophrenia, categorical response to treatment, dropouts for any reason and for inefficacy of treatment, and important adverse events. Results Forty blinded RCTs with 5172 unique participants (71.5% men; mean [SD] age, 38.8 [3.7] years) were included in the analysis. Few significant differences were found in all outcomes. In the primary outcome (reported as standardized mean difference; 95% credible interval), olanzapine was more effective than quetiapine (-0.29; -0.56 to -0.02), haloperidol (-0.29; -0.44 to -0.13), and sertindole (-0.46; -0.80 to -0.06); clozapine was more effective than haloperidol (-0.22; -0.38 to -0.07) and sertindole (-0.40; -0.74 to -0.04); and risperidone was more effective than sertindole (-0.32; -0.63 to -0.01). A pattern of superiority for olanzapine, clozapine, and risperidone was seen in other efficacy outcomes, but results were not consistent and effect sizes were usually small. In addition, relatively few RCTs were available for antipsychotics other than clozapine, haloperidol, olanzapine, and risperidone. The most surprising finding was that clozapine was not significantly better than most other drugs. Conclusions and Relevance Insufficient evidence exists on which antipsychotic is more efficacious for patients with treatment-resistant schizophrenia, and blinded RCTs, in contrast to unblinded randomized effectiveness studies, provide little evidence of the superiority of clozapine compared with other second-generation antipsychotics. Future clozapine studies with high doses and in patients with extremely treatment-refractory schizophrenia might be most promising to change the current evidence.
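The primary outcome above is reported as a standardized mean difference (SMD). As a reminder of what a single pooled value represents, a minimal sketch of an SMD (Hedges' g) for one hypothetical two-arm trial; the Bayesian random-effects network meta-analysis that pools such values across trials is not reproduced here:

```python
# Standardized mean difference (Hedges' g) for one two-arm trial; illustrative numbers.
from math import sqrt

def hedges_g(mean1, sd1, n1, mean2, sd2, n2):
    """Bias-corrected standardized mean difference between two trial arms."""
    sd_pooled = sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    d = (mean1 - mean2) / sd_pooled
    correction = 1 - 3 / (4 * (n1 + n2) - 9)   # small-sample correction
    return d * correction

# Hypothetical symptom-change scores (more negative = larger improvement)
print(hedges_g(mean1=-25.0, sd1=18.0, n1=60, mean2=-20.0, sd2=17.0, n2=58))
```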

Relevance: 100.00%

Publisher:

Abstract:

PURPOSE To analyze the indications and frequency for three-dimensional (3D) imaging for implant treatment planning in a pool of patients referred to a specialty clinic over a 3-year period. MATERIALS AND METHODS All patients who received dental implants between 2008 and 2010 at the Department of Oral Surgery and Stomatology at the University of Bern were included in the study. The influence of age, gender, and time of treatment (2008 to 2010) on the frequency of use of two-dimensional (2D) radiographic imaging modalities alone or in combination with 3D cone beam computed tomography (CBCT) scans was analyzed. Furthermore, the influence of the indication, location, and need for bone augmentation on the frequency of use of 2D imaging modalities alone or in combination with CBCT was evaluated. RESULTS In all, 1,568 patients (792 women and 776 men) received 2,279 implants. Overall, 633 patients (40.4%) were analyzed with 2D imaging procedures alone. CBCT was performed in 935 patients (59.6%). There was a statistically significant increase in CBCT between 2008 and 2010. Patients older than 55 years received a CBCT scan in addition to 2D radiographic imaging statistically significantly more often. Additional 3D imaging was most frequently performed in the posterior maxilla, whereas 2D radiographs alone exhibited the highest frequency in the anterior mandible. The combination of 2D with CBCT was used predominantly for implant placement with simultaneous or staged guided bone regeneration or sinus elevation. CONCLUSION Based on these findings from a specialty clinic, the use of additional CBCT imaging for implant treatment planning is influenced by the indication, location, local anatomy (including the need for bone augmentation), and the age of the patient.

Relevance: 100.00%

Publisher:

Abstract:

When proposing primary control (changing the world to fit the self)/secondary control (changing the self to fit the world) theory, Weisz et al. (1984) argued for the importance of the “serenity to accept the things I cannot change, the courage to change the things I can” (p. 967), and the wisdom to choose the right control strategy that fits the context. Although the dual-process theory of control generated hundreds of empirical studies, most of them focused on the dichotomy of PC and SC, and none of them tapped into the critical concept: individuals’ ability to know when to use what. This project addressed this issue by using scenario questions to study the impact of situationally adaptive control strategies on youth well-being. To understand the antecedents of youths’ preference for PC or SC, we also connected PCSC theory with Dweck’s implicit theory about the changeability of the world. We hypothesized that youths’ belief about the world’s changeability impacts how difficult it is for them to choose a situationally adaptive control orientation, which in turn impacts their well-being. This study included adolescents and emerging adults between the ages of 18 and 28 years (Mean = 20.87 years) from the US (n = 98), China (n = 100), and Switzerland (n = 103). Participants answered a questionnaire including a measure of implicit theories about the fixedness of the external world, a scenario-based measure of control orientation, and several measures of well-being. Preliminary analyses of the scenario-based control orientation measures showed striking cross-cultural similarity of preferred control responses: for three of the six scenarios, primary control was the predominantly chosen control response in all cultures, while for the other three scenarios secondary control was the predominantly chosen response. This suggests that youths across cultures are aware that some situations call for primary control, while others demand secondary control. We considered the control strategy winning the majority of the votes to be the situationally adaptive (normative) strategy. We then estimated a multi-group structural equation mediation model with the extent of belief in a fixed world as the independent variable, the difficulties of carrying out the respective adaptive versus non-adaptive control responses as two mediating variables, and the latent well-being variable as the dependent variable. The results showed a cross-culturally similar pattern of effects: a belief in a fixed world was significantly related to higher difficulties in carrying out both the normative and the non-normative control response, but only the difficulty of carrying out the normative control response (be it primary control in situations where primary control is normative, or secondary control in situations where secondary control is normative) was significantly related to lower reported well-being, while the difficulty of carrying out the non-normative response was unrelated to well-being. While previous research focused on cross-cultural differences in the choice of PC or SC, this study sheds light on the universal necessity of applying the right kind of control to fit the situation.
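The mediation structure described above (belief in a fixed world -> difficulty of the adaptive control response -> well-being) can be illustrated with a simple single-group, regression-based product-of-coefficients sketch on synthetic data; the study itself used a multi-group structural equation model with two mediators and a latent well-being factor, which this does not reproduce:

```python
# Simple regression-based mediation sketch (product of coefficients) on synthetic data.
# Variable names mirror the abstract; the numbers are simulated, not the study's data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 300
fixed_world = rng.normal(0, 1, n)                         # belief in a fixed world
difficulty = 0.4 * fixed_world + rng.normal(0, 1, n)      # difficulty of adaptive control response
well_being = -0.3 * difficulty + rng.normal(0, 1, n)      # well-being

# a path: belief in a fixed world -> difficulty
a = sm.OLS(difficulty, sm.add_constant(fixed_world)).fit().params[1]
# b path: difficulty -> well-being, controlling for the independent variable
b = sm.OLS(well_being, sm.add_constant(np.column_stack([difficulty, fixed_world]))).fit().params[1]

print("indirect effect (a*b):", a * b)   # negative, mirroring the reported pattern of effects
```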