925 results for "Weighted histogram analysis method"


Relevance: 100.00%

Abstract:

This study investigates the possibility of custom-fitting a widely accepted approximate yield surface equation (Ziemian, 2000) to the theoretical yield surfaces of five structural shapes: wide-flange, solid and hollow rectangular, and solid and hollow circular sections. To achieve this goal, a theoretically “exact” but overly complex representation of the cross section’s yield surface was first obtained from fundamental principles of solid mechanics. A weighted regression analysis was then performed on the “exact” yield surface data to obtain the coefficients of three terms in the approximate yield surface equation, thereby determining the “best” yield surface equation for a given cross-section geometry. Given that the exact yield surface should exhibit no concavity, this investigation evaluated the resulting coefficient of determination (
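The weighted fitting step can be sketched in a few lines. The exact three-term form of Ziemian's approximate yield surface equation is not reproduced in the abstract, so the basis c1*p^2 + c2*m^2 + c3*p^2*m^2 = 1 below is only a hypothetical stand-in; the weighted least-squares machinery is the point.

```python
import numpy as np

# Hypothetical three-term yield surface: c1*p^2 + c2*m^2 + c3*p^2*m^2 = 1,
# fitted to sampled points of the "exact" surface by weighted least squares.
def fit_yield_surface(p, m, w):
    X = np.column_stack([p**2, m**2, (p**2) * (m**2)])  # the three basis terms
    y = np.ones_like(p)      # every sampled point lies on the yield surface
    sw = np.sqrt(w)          # weighted least squares via row scaling
    c, *_ = np.linalg.lstsq(sw[:, None] * X, sw * y, rcond=None)
    return c

# Synthetic check: points on the ellipse p^2 + 2*m^2 = 1 (so c3 should be ~0)
t = np.linspace(0.01, np.pi / 2 - 0.01, 50)
p, m = np.cos(t), np.sin(t) / np.sqrt(2)
c = fit_yield_surface(p, m, np.ones_like(p))
```

With nonuniform weights, the fit would trade accuracy between regions of the surface, which is how a "best" equation for a given geometry can be steered.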

Relevance: 100.00%

Abstract:

As lightweight and slender structural elements are used more frequently in design, large-scale structures become more flexible and susceptible to excessive vibrations. To ensure the functionality of the structure, the dynamic properties of the occupied structure need to be estimated during the design phase. Traditional analysis methods model occupants simply as additional mass; however, research has shown that human occupants are better modeled as an additional degree of freedom. In the United Kingdom, active and passive crowd models have been proposed by the Joint Working Group (JWG) as a result of a series of analytical and experimental studies. These crowd models are expected to yield a more accurate estimate of the dynamic response of the occupied structure. However, experimental testing recently conducted through a graduate student project at Bucknell University indicated that the proposed passive crowd model might not accurately represent the occupants' impact on the structure. The objective of this study is to assess the validity of the JWG crowd models by comparing the dynamic properties obtained from experimental testing data with analytical modeling results. The experimental data used in this study were collected by Firman in 2010. The analytical results were obtained by performing a time-history analysis on a finite element model of the occupied structure. The crowd models were created based on the JWG recommendations combined with the physical properties of the occupants during the experimental study. SAP2000 was used to create the finite element models and run the analyses; Matlab and ME'scope were used to obtain the dynamic properties of the structure by processing the time-history results from SAP2000.
The results of this study indicate that the active crowd model can quite accurately represent the impact of occupants standing with bent knees, while the passive crowd model could not properly simulate the dynamic response of the structure when occupants were standing straight or sitting. Future work involves improving the passive crowd model and evaluating the crowd models against full-scale structure models and operating data.
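The modeling difference the abstract describes, added mass versus an added degree of freedom, can be illustrated with a minimal two-DOF sketch. The masses, stiffnesses and frequencies below are invented for illustration and are not the JWG-recommended parameters.

```python
import numpy as np

# Illustrative 2-DOF human-structure model: structure (m_s, k_s) plus an
# occupant modeled as an attached spring-mass (m_h, k_h). All values are
# made up for demonstration; they are NOT the JWG crowd-model parameters.
m_s, k_s = 2000.0, 2000.0 * (2 * np.pi * 6.0) ** 2   # bare structure: 6 Hz
m_h, k_h = 500.0, 500.0 * (2 * np.pi * 5.0) ** 2     # passive occupant: 5 Hz

M = np.diag([m_s, m_h])
K = np.array([[k_s + k_h, -k_h],
              [-k_h,       k_h]])

# Undamped natural frequencies from the generalized eigenproblem K v = w^2 M v
w2 = np.linalg.eigvals(np.linalg.solve(M, K))
freqs = np.sort(np.sqrt(w2.real)) / (2 * np.pi)
```

Unlike the added-mass idealization, which simply lowers the single structural frequency, the spring-mass occupant splits the response into two coupled modes bracketing the original frequencies, which is why the choice of occupant model changes the predicted dynamic properties.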

Relevance: 100.00%

Abstract:

There are numerous statistical methods for quantitative trait linkage analysis in human studies. An ideal such method would have high power to detect genetic loci contributing to the trait, would be robust to non-normality in the phenotype distribution, would be appropriate for general pedigrees, would allow the incorporation of environmental covariates, and would be appropriate in the presence of selective sampling. We recently described a general framework for quantitative trait linkage analysis, based on generalized estimating equations, of which many current methods are special cases. This procedure is appropriate for general pedigrees and easily accommodates environmental covariates. In this paper, we use computer simulations to investigate the power and robustness of a variety of linkage test statistics built upon our general framework. We also propose two novel test statistics that take account of higher moments of the phenotype distribution, in order to accommodate non-normality. These new linkage tests are shown to have high power and to be robust to non-normality. While we have not yet examined the performance of our procedures in the context of selective sampling via computer simulations, the proposed tests satisfy all of the other qualities of an ideal quantitative trait linkage analysis method.
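As a concrete special case of such regression-based linkage tests (not the paper's GEE statistics themselves), classical Haseman-Elston regression relates squared sib-pair trait differences to marker IBD sharing; a significantly negative slope indicates linkage. The data below are synthetic.

```python
import numpy as np

# Haseman-Elston regression sketch: regress squared sib-pair trait
# differences on the pair's estimated proportion of alleles shared IBD.
# Under linkage, greater sharing implies more similar phenotypes, so the
# fitted slope should be negative. Synthetic data, not from the paper.
rng = np.random.default_rng(0)
n = 400
pi_ibd = rng.choice([0.0, 0.5, 1.0], size=n, p=[0.25, 0.5, 0.25])
# Simulate linkage: squared difference shrinks as IBD sharing rises.
sq_diff = 2.0 - 1.5 * pi_ibd + rng.normal(0.0, 0.3, size=n)

X = np.column_stack([np.ones(n), pi_ibd])
beta, *_ = np.linalg.lstsq(X, sq_diff, rcond=None)
slope = beta[1]   # expected negative under linkage
```

The paper's framework generalizes this idea; the higher-moment statistics it proposes additionally adjust for skewness and kurtosis of the phenotype, which this minimal sketch ignores.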

Relevance: 100.00%

Abstract:

Wind power based generation has been growing rapidly world-wide in the recent past. To transmit large amounts of wind power over long distances, system planners often add series compensation to existing transmission lines, owing to benefits such as an improved steady-state power transfer limit, improved transient stability, and efficient utilization of transmission infrastructure. However, the application of series capacitors has raised concerns about resonant interactions, such as subsynchronous resonance (SSR) with conventional turbine-generators. Wind turbine-generators may also be susceptible to such resonant interactions, yet little information is available in the literature, and engineering standards have yet to address these issues. The motivating problem for this research is an actual system switching event that resulted in undamped oscillations in a 345-kV series-compensated power system with a typical ring-bus configuration. Based on time-domain ATP (Alternative Transients Program) modeling, simulations, and analysis of system event records, the occurrence of subsynchronous interactions within the existing 345-kV series-compensated power system has been investigated. The effects of various small-signal and large-signal power system disturbances, with both identical and non-identical wind turbine parameters (e.g., with a statistical spread), have been evaluated. The effect of parameter variations on subsynchronous oscillations has been quantified using 3D-DFT plots, and the oscillations have been identified as arising from electrical self-excitation effects rather than torsional interaction. Further, the generator no-load reactance and the rotor-side converter inner-loop controller gains have been identified as the parameters to which the damping or exacerbation of the self-excited oscillations is most sensitive.
A higher-order spectral analysis method based on modified Prony estimation has been successfully applied to the field records, identifying dominant 9.79 Hz subsynchronous oscillations. Recommendations have been made for exploring countermeasures.
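The modified, higher-order Prony variant is not detailed in the abstract, but the classical Prony idea behind it can be sketched: fit a linear-prediction model to uniform samples, then read modal frequency and damping off the roots of the prediction polynomial. The 9.79 Hz value is reused here only to build a synthetic test signal.

```python
import numpy as np

# Classical Prony sketch (not the paper's modified/higher-order variant):
# fit x[n] = c1*x[n-1] + c2*x[n-2] by least squares, then recover the mode
# from the roots of z^2 - c1*z - c2. Synthetic damped 9.79 Hz signal.
fs, f0, sigma = 100.0, 9.79, -0.8
t = np.arange(0, 2.0, 1.0 / fs)
x = np.exp(sigma * t) * np.cos(2 * np.pi * f0 * t)

A = np.column_stack([x[1:-1], x[:-2]])        # rows [x[n-1], x[n-2]]
c, *_ = np.linalg.lstsq(A, x[2:], rcond=None)
z = np.roots([1.0, -c[0], -c[1]])
pole = z[np.argmax(z.imag)]                   # upper-half-plane pole
f_est = np.angle(pole) * fs / (2 * np.pi)     # modal frequency, Hz
damping = np.log(np.abs(pole)) * fs           # continuous-time sigma, 1/s
```

On noisy field records the plain least-squares step is fragile, which is one motivation for modified and higher-order variants such as the one the study applies.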

Relevance: 100.00%

Abstract:

Liquid films, evaporating or non-evaporating, are ubiquitous in nature and technology. The dynamics of evaporating liquid films is relevant to several industries, such as water recovery, heat exchangers, crystal growth, and drug design. The theory describing the dynamics of liquid films crosses several fields, including engineering, mathematics, materials science, biophysics, and volcanology. Interfacial instabilities typically manifest as the undulation of an interface from a presumed flat state, as the onset of a secondary flow state from a primary quiescent state, or both. To study the instabilities affecting liquid films, an evaporating/non-evaporating Newtonian liquid film is subjected to a perturbation. Numerical analysis is conducted on configurations of such liquid films heated on solid surfaces in order to examine the various stabilizing and destabilizing mechanisms that can cause the formation of different convective structures, which in turn have implications for the heat transfer that occurs via this process. Certain aspects of this topic have not received attention, as the literature review makes clear. Static, horizontal liquid films on solid surfaces are examined for their resistance to long-wave instabilities via linear stability analysis, the method of normal modes, and finite difference methods. The spatiotemporal evolution equation available in the literature, describing the time evolution of a liquid film heated on a solid surface, is used to analyze the various stabilizing and destabilizing mechanisms affecting evaporating and non-evaporating liquid films. The impact of these mechanisms on film stability and structure, for both buoyant and non-buoyant films, is examined by varying the mechanical and thermal boundary conditions. Films evaporating in zero gravity are also studied using the evolution equation.
It is found that films that are stable to long-wave instabilities in terrestrial gravity are prone to destabilization by long-wave instabilities in zero gravity.
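The evolution equation itself is not reproduced in the abstract, but the normal-mode step can be sketched on a generic long-wave model h_t = -A*h_xx - B*h_xxxx, where A and B are hypothetical lumped coefficients standing in for the destabilizing and stabilizing mechanisms. Substituting h = 1 + eps*exp(i*k*x + s*t) gives the growth rate s(k) = A*k^2 - B*k^4.

```python
import numpy as np

# Normal-mode growth rate for a generic long-wave film model
# h_t = -A*h_xx - B*h_xxxx. Substituting h = 1 + eps*exp(i*k*x + s*t)
# gives s(k) = A*k**2 - B*k**4. A and B are illustrative stand-ins, not
# the coefficients of the paper's evolution equation.
A, B = 1.0, 0.25

def growth_rate(k):
    return A * k**2 - B * k**4

k_cut = np.sqrt(A / B)          # neutral wavenumber: s(k_cut) = 0
k_max = np.sqrt(A / (2.0 * B))  # fastest-growing wavenumber: ds/dk = 0
s_max = growth_rate(k_max)      # peak growth rate, A**2 / (4*B)
```

A change of environment such as removing gravity alters the effective A and B; a film with A <= 0 (stable, no growing modes) on Earth can acquire A > 0 in zero gravity, which is the flavor of the destabilization the study reports.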

Relevance: 100.00%

Abstract:

Even though complete resection is regarded as the only curative treatment for non-small cell lung cancer (NSCLC), >50% of resected patients die from a recurrence or a second primary tumour of the lung within 5 yrs. It remains unclear whether follow-up in these patients is cost-effective and whether it can improve the outcome through early detection of recurrent tumour. The benefit of regular follow-up was analysed in a consecutive series of 563 patients who had undergone potentially curative resection for NSCLC at the University Hospital. The follow-up consisted of clinical visits and chest radiography according to a standard protocol for up to 10 yrs. Survival rates were estimated using the Kaplan-Meier method, and the cost-effectiveness of the follow-up programme was assessed. A total of 23 patients (6.4% of the lobectomy group) underwent further surgery with curative intent for a second pulmonary malignancy. Regular follow-up over a 10-yr period provided the chance of a second curative treatment to 3.8% of all patients. The calculated cost per life-yr gained was 90,000 Swiss francs, far above that of comparable large-scale surveillance programmes. Based on these data, the intensity and duration of the follow-up were reduced.
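The Kaplan-Meier estimator used for the survival rates can be sketched directly; the times and censoring flags below are synthetic, not the study's data.

```python
# Kaplan-Meier estimate: at each distinct event time t, multiply the
# running survival probability by (1 - deaths/at_risk), where at_risk
# counts subjects still under observation. event: 1 = death, 0 = censored.
def kaplan_meier(times, event):
    data = sorted(zip(times, event))
    n = len(data)
    s, curve, i = 1.0, [], 0
    while i < n:
        t = data[i][0]
        at_risk = n - i                      # subjects with time >= t
        deaths = 0
        while i < n and data[i][0] == t:
            deaths += data[i][1]
            i += 1
        if deaths:
            s *= 1.0 - deaths / at_risk
            curve.append((t, s))
    return curve

# Synthetic cohort: deaths at years 2, 3 and 5; censoring at years 3 and 7
curve = kaplan_meier([2, 3, 3, 5, 7], [1, 1, 0, 1, 0])
```

Censored subjects drop out of the at-risk count without forcing a step in the curve, which is exactly why the method suits follow-up cohorts with staggered losses.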

Relevance: 100.00%

Abstract:

Frequency-transformed EEG resting data have been widely used to describe normal and abnormal brain functional states as a function of the spectral power in different frequency bands, yielding a series of clinically relevant findings. However, by transforming the EEG into the frequency domain, the initially excellent time resolution of time-domain EEG is lost. Topographic time-frequency decomposition is a novel computerized EEG analysis method that combines previously available techniques from time-domain spatial EEG analysis and time-frequency decomposition of single-channel time series. It yields a new, physiologically and statistically plausible topographic time-frequency representation of human multichannel EEG. The original EEG is accounted for by the coefficients of a large set of user-defined, EEG-like time series, which are optimized for maximal spatial smoothness and minimal norm. These coefficients are then reduced to a small number of model scalp field configurations, which vary in intensity as a function of time and frequency. The result is thus a small number of EEG field configurations, each with a corresponding time-frequency (Wigner) plot. The method has several advantages: it does not assume that the data are composed of orthogonal elements, it does not assume stationarity, it produces topographical maps, and it allows the inclusion of user-defined, specific EEG elements such as spike-and-wave patterns. After a formal introduction of the method, several examples are given, including artificial data and multichannel EEG recorded during different physiological and pathological conditions.
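The paper's topographic decomposition is not reproduced here, but its single-channel building block, a time-frequency decomposition, can be sketched with a windowed Fourier transform. The window length and the synthetic "alpha burst" below are arbitrary illustrative choices.

```python
import numpy as np

# Single-channel time-frequency decomposition via a Hann-windowed short-time
# Fourier transform -- only the generic building block, not the paper's
# topographic method, which additionally models the scalp field.
def stft_power(x, fs, win=64, hop=32):
    w = np.hanning(win)
    frames = [x[i:i + win] * w for i in range(0, len(x) - win + 1, hop)]
    spec = np.abs(np.fft.rfft(frames, axis=1)) ** 2   # power per frame/bin
    freqs = np.fft.rfftfreq(win, 1.0 / fs)
    return freqs, np.array(spec)

# Demo: 10 Hz oscillation present only in the second half of a 2 s signal
fs = 128
t = np.arange(0, 2.0, 1.0 / fs)
x = np.where(t >= 1.0, np.sin(2 * np.pi * 10 * t), 0.0)
freqs, P = stft_power(x, fs)
```

The output localizes the 10 Hz power in both time (late frames) and frequency, recovering exactly the temporal information that a plain power spectrum of the whole epoch discards.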

Relevance: 100.00%

Abstract:

Objective: Identification of the ventrointermediate thalamic nucleus (Vim) on modern 3T high-field MRI for image-based targeting in deep brain stimulation (DBS) is still challenging. To evaluate the usefulness and reliability of analyzing connectivity with the cerebellum using Q-ball calculation, we performed a retrospective analysis. Method: Five patients who underwent bilateral implantation of electrodes in the Vim for treatment of essential tremor between 2011 and 2012 received additional preoperative Q-ball imaging. Targeting was performed according to atlas coordinates and standard MRI. Additionally, we retrospectively identified the Vim by analyzing the connectivity of the thalamus with the dentate nucleus. The exact position of the active stimulation contact in the postoperative CT was correlated with the Vim as identified by Q-ball calculation. Results: Localization of the Vim by analysis of the connectivity between thalamus and cerebellum was successful in all 5 patients on both sides. The average position of the active contacts was 14.6 mm (SD 1.24) lateral, 5.37 mm (SD 0.094) posterior and 2.21 mm (SD 0.69) cranial of MC. The cranial portion of the dentato-rubro-thalamic tract was localized an average of 3.38 mm (SD 1.57) lateral and 1.5 mm (SD 1.22) posterior of the active contact. Conclusions: Connectivity analysis by Q-ball calculation provided direct visualization of the Vim in all cases. Our preliminary results suggest that the target determined by connectivity analysis is valid and could be used in addition to, or even instead of, atlas-based targeting. Larger prospective studies are needed to determine the robustness of this method in providing refined information useful for neurosurgical treatment of tremor.

Relevance: 100.00%

Abstract:

Natural soil profiles may be interpreted as an arrangement of parts characterized by properties such as hydraulic conductivity and water retention function. These parts form a complicated structure. Characterizing the soil structure is fundamental in subsurface hydrology because it has a crucial influence on flow and transport and defines the patterns of many ecological processes. We applied an image analysis method for recognition and classification of visual soil attributes in order to model flow and transport through a man-made soil profile. Modeled and measured saturation-dependent effective parameters were compared. We found that characterizing and describing conductivity patterns in soils with sharp conductivity contrasts is feasible. In contrast, solving flow and transport on the basis of these conductivity maps is difficult and, in general, requires special care in representing small-scale processes.

Relevance: 100.00%

Abstract:

The tropical region is an area of maximum humidity and serves as the major humidity source of the globe. Among other phenomena, it is governed by the so-called Inter-Tropical Convergence Zone (ITCZ), which is commonly defined by converging low-level winds or enhanced precipitation. Given its importance as a humidity source, we investigate the humidity fields in the tropics in different reanalysis data sets, deduce the climatology and variability, and assess the relationship to the ITCZ. To this end, a new analysis method of the specific humidity distribution is introduced which detects the location of the humidity maximum, its strength and its meridional extent. The results show that the humidity maximum in boreal summer is strongly shifted northward over the warm pool/Asian monsoon area and the Gulf of Mexico. These shifts go along with a peak in strength in both areas; however, the extent shrinks over the warm pool/Asian monsoon area, whereas it widens over the Gulf of Mexico. In winter, such connections between location, strength and extent are not found, although a peak in strength is again identified over the Gulf of Mexico. The variability of the three characteristics is dominated by inter-annual signals in both seasons. The results using ERA-Interim data suggest a positive trend in the Gulf of Mexico/Atlantic region from 1979 to 2010, showing an increased northward shift in recent years. Although the trend is only weakly confirmed by the results using MERRA reanalysis data, it is in phase with a trend in hurricane activity, a possible hint of the relevance of the new method to hurricanes. Furthermore, the position of the humidity maximum coincides with that of the ITCZ in most areas. One exception is the western and central Pacific, where the area is dominated by the double ITCZ in boreal winter.
Nevertheless, the new method enables us to gain more insight into the humidity distribution, its variability and its relationship to ITCZ characteristics.
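The three characteristics the new method extracts, location, strength and meridional extent, can be sketched for a single meridional profile. The half-maximum width used for the extent is an assumption for illustration, and the profile itself is synthetic.

```python
import numpy as np

# Characterize a meridional specific-humidity profile by the three measures
# named in the abstract: location (latitude of the maximum), strength
# (maximum value) and meridional extent (here: width where q exceeds half
# the maximum -- the half-maximum threshold is an illustrative assumption).
def humidity_characteristics(lat, q):
    i = int(np.argmax(q))
    location, strength = lat[i], q[i]
    above = q >= 0.5 * strength
    extent = lat[above].max() - lat[above].min()
    return location, strength, extent

# Synthetic profile: Gaussian humidity band centred at 8 deg N
lat = np.linspace(-30.0, 30.0, 601)
q = 18.0 * np.exp(-((lat - 8.0) / 10.0) ** 2)   # specific humidity, g/kg
loc, strength, extent = humidity_characteristics(lat, q)
```

Applied longitude by longitude and month by month to reanalysis humidity fields, such a routine yields the shift, strength and widening/shrinking signals the abstract describes.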

Relevance: 100.00%

Abstract:

OBJECTIVE We sought to evaluate potential reasons given by board-certified doctors for the persistence of adverse events despite efforts to improve patient safety in Switzerland. SUMMARY BACKGROUND DATA In recent years, substantial efforts have been made to improve patient safety by introducing surgical safety checklists to standardise surgeries and team procedures. Still, a high number of adverse events remain. METHODS Clinic directors in operative medicine in Switzerland were asked to answer two questions concerning the reasons for persistence of adverse events, and the advantages and disadvantages of introducing and implementing surgical safety checklists. Of 799 clinic directors, the arguments of 237 (29.7%) were content-analysed using Mayring's content analysis method, resulting in 12 different categories. RESULTS Potential reasons for the persistence of adverse events were mainly seen as being related to the "individual" (126/237, 53.2%), but directors of high-volume clinics identified factors related to the "group and interactions" significantly more often as a reason (60.2% vs 40.2%; p = 0.003). Surgical safety checklists were thought to have positive effects on the "organisational level" (47/237, 19.8%), the "team level" (37/237, 15.6%) and the "patient level" (40/237, 16.9%), with a "lack of willingness to implement checklists" as the main disadvantage (34/237, 14.3%). CONCLUSION This qualitative study revealed the individual as the main player in the persistence of adverse events. Working conditions should be optimised to minimise interface problems in the case of cross-covering of patients, to assure support for students, residents and interns, and to reduce strain. Checklists are helpful on an "organisational level" (e.g., financial benefits, quality assurance) and to clarify responsibilities.

Relevance: 100.00%

Abstract:

In a phase I clinical trial, six multiple myeloma patients who were non-responsive to conventional therapy and scheduled for bone marrow transplantation received Holmium-166 (¹⁶⁶Ho) labeled to a bone-seeking agent, DOTMP (1,4,7,10-tetraazacyclododecane-1,4,7,10-tetramethylene-phosphonic acid), for the purpose of bone marrow ablation. The specific aims of my research within this protocol were to evaluate the toxicity and efficacy of ¹⁶⁶Ho DOTMP by quantifying the in vivo pharmacokinetics and radiation dosimetry, and by correlating these results with the observed biologic response. The reproducibility of pharmacokinetics across multiple injections of ¹⁶⁶Ho DOTMP administered to these myeloma patients was demonstrated from both blood and whole-body retention. The skeletal concentration of ¹⁶⁶Ho DOTMP was heterogeneous in all six patients: high in the ribs, pelvis and lumbar vertebrae, and relatively low in the femurs, arms and head. A novel technique was developed to calculate the radiation dose to the bone marrow in each skeletal region of interest (ROI), and was applied to all six ¹⁶⁶Ho DOTMP patients. Radiation dose estimates for the bone marrow calculated using the standard MIRD "S" factors were compared with the average values derived from the heterogeneous distribution of activity in the skeleton (i.e., the regional technique). The results from the two techniques were significantly different; the dose estimates from the regional technique were typically 30% greater on average. Furthermore, the regional technique provided a range of radiation doses over the entire marrow volume, whereas the MIRD "S" factors provided only a single value. Dose-volume histogram analysis of data from the regional technique indicated a range of dose estimates that varied by a factor of 10 between the high-dose and low-dose regions.
Finally, the observed clinical response of cells and abnormal proteins measured in bone marrow aspirates and peripheral blood samples was compared with radiation dose estimates for the bone marrow calculated by the standard and regional techniques. The results showed that the regional technique values correlated more closely with several clinical response parameters. (Abstract shortened by UMI.)
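The dose-volume histogram step can be sketched as a cumulative, mass-weighted tally over regional dose estimates. The regions, doses and mass fractions below are synthetic, not patient data.

```python
import numpy as np

# Cumulative dose-volume histogram over regional marrow dose estimates:
# the fraction of total marrow mass receiving at least dose D. All numbers
# below are synthetic and purely illustrative.
def cumulative_dvh(doses, masses, grid):
    d = np.asarray(doses, float)
    w = np.asarray(masses, float) / np.sum(masses)   # mass fractions
    return np.array([w[d >= g].sum() for g in grid])

doses = [40.0, 12.0, 25.0, 8.0]    # Gy per skeletal region (synthetic)
masses = [0.2, 0.3, 0.3, 0.2]      # marrow mass fraction per region
dvh = cumulative_dvh(doses, masses, [0.0, 10.0, 20.0, 30.0])
```

Whereas a single-value approach reports only the mass-weighted mean dose, the histogram exposes the spread between high-dose and low-dose marrow regions, the quantity the regional technique is designed to capture.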

Relevance: 100.00%

Abstract:

AIM: MRI and PET with 18F-fluoro-ethyl-tyrosine (FET) have been increasingly used to evaluate patients with gliomas. Our purpose was to assess the additive value of MR spectroscopy (MRS), diffusion imaging and dynamic FET-PET for glioma grading. PATIENTS, METHODS: 38 patients (aged 42 ± 15 years, F/M ratio 0.46) with untreated, histologically proven brain gliomas were included. All underwent conventional MRI, MRS, diffusion sequences and FET-PET within 3-4 weeks. The performance of the tumour FET time-activity curve, early-to-middle SUVmax ratio, choline/creatine ratio and ADC histogram distribution pattern for glioma grading was assessed against histology. Combinations of these parameters and their respective odds were also evaluated. RESULTS: The tumour time-activity curve reached the best accuracy (67%) when taken alone to distinguish between low- and high-grade gliomas, followed by ADC histogram analysis (65%). Combining the time-activity curve with ADC histogram analysis improved the sensitivity from 67% to 86% and the specificity from 63-67% to 100% (p < 0.008). On multivariate logistic regression analysis, however, a negative slope of the tumour FET time-activity curve remained the best predictor of high-grade glioma (odds 7.6, SE 6.8, p = 0.022). CONCLUSION: Combining dynamic FET-PET and diffusion MRI achieved good performance for glioma grading. The use of FET-PET/MR may be highly relevant in the initial assessment of primary brain tumours.
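The time-activity-curve criterion reduces to the sign of a fitted slope. The frame times and activity values below are invented for illustration only; they are not the study's curves, units or thresholds.

```python
import numpy as np

# Sign of the fitted slope of a dynamic FET time-activity curve as a
# grading marker: a negative slope (tracer washout) points toward high
# grade, a rising curve toward low grade. Synthetic curves only.
def tac_slope(t_min, activity):
    return np.polyfit(t_min, activity, 1)[0]   # linear slope

t = np.array([10.0, 15.0, 20.0, 25.0, 30.0, 35.0, 40.0])
high_grade = np.array([3.2, 3.0, 2.8, 2.6, 2.5, 2.3, 2.2])   # washout
low_grade = np.array([1.8, 2.0, 2.1, 2.2, 2.3, 2.35, 2.4])   # steady uptake

s_high = tac_slope(t, high_grade)   # negative
s_low = tac_slope(t, low_grade)     # positive
```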

Relevance: 100.00%

Abstract:

ABSTRACT: BACKGROUND: We consider how representations of geographic variation in prostate cancer incidence across Southern New England, USA, may be affected by the selection of study area and/or the properties of the statistical analysis. METHOD: A spatial scan statistic was used to monitor geographic variation among 35,167 incident prostate cancer cases diagnosed in Massachusetts, Connecticut and Rhode Island from 1994 to 1998, in relation to the 1990 populations of men 20+ years of age living in that region. Results from the combined-states analysis were compared with those from single-state analyses. The impact of scanning procedures set to examine up to 50%, or no more than 10%, of the at-risk population was also evaluated. RESULTS: With scanning set to 50%, 5 locations in the combined-states analysis were identified with markedly distinct incidence rates. Fewer cases than expected were estimated for nearly all of Connecticut, Rhode Island and West Central Massachusetts, whereas census tracts on and around Cape Cod, and areas of Southwestern Connecticut and adjacent to greater Boston, were estimated to have yielded higher than expected incidence. Results of the single-state analyses exhibited several discrepancies from the combined-states analysis. More conservative scanning found many more locations with varying incidence, but discrepancies between the combined- and single-state analyses were fewer. CONCLUSION: It is important to acknowledge the conditional nature of spatial analyses and to consider carefully whether a true cluster of events has been identified, or an artifact stemming from the selection of study area size and/or scanning properties.
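The core of the spatial scan statistic is a Poisson log-likelihood ratio evaluated for each candidate window. Only that single-window score is sketched here; the full method maximizes it over all windows up to the population cap and assesses significance by Monte Carlo. The counts below are illustrative.

```python
import numpy as np

# Poisson log-likelihood ratio of the spatial scan statistic for ONE
# candidate window: c observed and E expected cases inside the window,
# C total cases (expected counts scaled to sum to C). Numbers illustrative.
def scan_llr(c, E, C):
    if c <= E:      # score only windows with elevated incidence
        return 0.0
    return c * np.log(c / E) + (C - c) * np.log((C - c) / (C - E))

llr = scan_llr(c=120, E=80.0, C=35167)   # elevated window, positive score
```

The dependence of E on the chosen study area, and of the window search on the scanning cap (50% vs 10% of the at-risk population), is precisely what makes the reported clusters conditional on those analysis choices.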

Relevance: 100.00%

Abstract:

The aim of this paper is to determine whether there is a significant difference when an NDVI dataset processed by the harmonic analysis method is used to evaluate NDVI dynamics and its response to climate change, compared with the original data.
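Harmonic analysis of an NDVI series typically means a least-squares fit of a mean plus annual harmonic, characterized by amplitude and phase. A minimal sketch on a synthetic series of 23 composites per year (an assumption matching common 16-day NDVI products, not necessarily this paper's data):

```python
import numpy as np

# Harmonic analysis of an NDVI time series: least-squares fit of a mean
# plus one annual harmonic, returning (mean, amplitude, phase).
def annual_harmonic(ndvi):
    n = len(ndvi)
    tt = np.arange(n)
    X = np.column_stack([np.ones(n),
                         np.cos(2 * np.pi * tt / n),
                         np.sin(2 * np.pi * tt / n)])
    b, *_ = np.linalg.lstsq(X, ndvi, rcond=None)
    amp = np.hypot(b[1], b[2])        # seasonal amplitude
    phase = np.arctan2(b[2], b[1])    # timing of the seasonal peak
    return b[0], amp, phase

# Synthetic one-year series, 23 composites, known mean/amplitude/phase
t = np.arange(23)
ndvi = 0.45 + 0.25 * np.cos(2 * np.pi * t / 23 - 1.0)
mean, amp, phase = annual_harmonic(ndvi)
```

Replacing the raw series by this fitted harmonic is one common sense of "processed by the harmonic analysis method": noise and cloud artifacts are smoothed away, which is exactly the processing difference whose significance the paper sets out to test.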