932 results for Load test on SPT sampler
Abstract:
Introduction: The term "clinimetrics" was introduced by Feinstein in 1982, who first noticed that, despite all the improvements in assessment methods, a number of clinical phenomena were still left out of the evaluation process. Even today, clinical phenomena such as stress, which are relevant to disease progression and course, are not fully evaluated. Only recently, following the clinimetric approach, have Fava and colleagues introduced specific criteria for evaluating allostatic overload in the clinical setting. Methods: Participants were 240 blood donors recruited from May 2007 to December 2009 at 4 different blood centres (AVIS) in Italy. Blood samples from each participant were collected for laboratory testing on the same day the self-rating instruments were administered (Psychosocial Index, Symptom Questionnaire, Psychological Well-Being scales, Temperament and Character Inventory, Self-Report Altruism scale). The study explores sample characteristics and correlates of stress in the total sample (part I), new selection criteria applied to existing instruments to identify individuals reporting allostatic load (AL) (part II), and differences in biological correlates between subjects with and without AL (part III). Results: Significant differences according to gender and past illnesses were found in several dimensions of well-being and distress. Furthermore, more than 60% of the variance in distress was explained by 4 main factors: anxiety, somatic symptoms, environmental mastery and persistence. According to the new criteria, 98 donors reported AL. Individuals with allostatic load reported engaging in fewer altruistic behaviours, and they differed from controls in personality traits and character. In the last part, results showed significant differences among donors according to allostatic load on several biological parameters (RBC, MCV, immune assay). Conclusion: This study has obvious limitations due to its preliminary nature. Further research is needed to confirm that these new criteria may help identify high-risk individuals reporting not only stressful situations but also vulnerabilities.
Abstract:
The subject of this thesis is the accurate measurement of time dilation, aiming at a quantitative test of special relativity. By means of laser spectroscopy, the relativistic Doppler shifts of a clock transition in the metastable triplet spectrum of ^7Li^+ are measured simultaneously with and against the direction of motion of the ions. By employing saturation or optical double resonance spectroscopy, the Doppler broadening caused by the ions' velocity distribution is eliminated. From these shifts, both the time dilation factor and the ion velocity can be extracted with high accuracy, allowing a test of the predictions of special relativity. A diode laser and a frequency-doubled titanium-sapphire laser were set up for antiparallel and parallel excitation of the ions, respectively. To achieve robust control of the laser frequencies required for the beam times, a redundant system of frequency standards was developed, consisting of a rubidium spectrometer, an iodine spectrometer, and a frequency comb. At the experimental section of the ESR, an automated laser beam guiding system for exact control of polarisation, beam profile, and overlap with the ion beam, as well as a fluorescence detection system, were set up. During the first experiments, the production, acceleration and lifetime of the metastable ions at the GSI heavy ion facility were investigated for the first time. The characterisation of the ion beam made it possible, for the first time, to measure its velocity directly via the Doppler effect, which resulted in a new, improved calibration of the electron cooler. In the following step, the first sub-Doppler spectroscopy signals from an ion beam at 33.8% of the speed of light were recorded. The unprecedented accuracy of these experiments made it possible to derive a new upper bound for possible higher-order deviations from special relativity. Moreover, future measurements with the experimental setup developed in this thesis have the potential to improve the sensitivity to low-order deviations by at least one order of magnitude compared to previous experiments, and will thus contribute further to testing the standard model.
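The logic of the two-beam measurement can be condensed into a single relation. As a sketch in standard notation (assumed here; the thesis' own symbols are not quoted), let β = v/c be the ion velocity, ν₀ the rest-frame transition frequency, and ν_p, ν_a the lab-frame laser frequencies resonant in parallel and antiparallel geometry:

```latex
\nu_p = \gamma\,(1+\beta)\,\nu_0,
\qquad
\nu_a = \gamma\,(1-\beta)\,\nu_0,
\qquad
\gamma = \frac{1}{\sqrt{1-\beta^2}}
\;\;\Longrightarrow\;\;
\frac{\nu_p\,\nu_a}{\nu_0^{2}} = \gamma^{2}\,(1-\beta^{2}) = 1 .
```

Any deviation of this product from unity at a given β would signal a departure from the special-relativistic Doppler formula; at the same time, the ratio ν_p/ν_a = (1+β)/(1−β) yields β itself, which is why the experiment needs no independent velocity calibration.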
Abstract:
English: The assessment of the safety of existing bridges and viaducts led the Ministry of Public Works of the Netherlands to finance a campaign aimed at studying the response of the elements of these infrastructures. This activity focuses on the behaviour of reinforced concrete slabs under concentrated loads, using finite element modelling and comparison with experimental results. These elements are characterised by shear behaviour and shear failure, whose modelling is a hard computational challenge owing to the brittle behaviour combined with three-dimensional effects. The numerical modelling of the failure is studied through Sequentially Linear Analysis (SLA), an alternative finite element method to traditional incremental-iterative approaches. The comparison between the two numerical techniques is among the first such comparisons in a three-dimensional setting, and it is carried out using one of the experimental tests performed on reinforced concrete slabs. The advantage of SLA is that it avoids the well-known convergence problems of typical nonlinear analyses by directly specifying a damage increment, in terms of a reduction of stiffness and strength in a particular finite element, instead of increments of load or displacement on the whole structure. For the first time, particular attention has been paid to specific aspects of the slabs, such as accurate modelling of the constraints and the sensitivity of the solution to mesh density. This detailed parameter study showed a strong influence of the tensile fracture energy, the mesh density and the chosen model on the solution in terms of force-displacement diagram, crack-pattern distribution and shear failure mode. SLA showed great potential, but it requires further development in two modelling respects: load conditions (constant and proportional loads) and the softening behaviour of brittle materials (such as concrete) in three dimensions, in order to widen its applicability to these new contexts of study.
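The saw-tooth logic described above can be sketched in a few lines. The toy model below is an illustration only (parallel springs sharing one displacement, so the "linear analysis" is trivial), not the thesis' three-dimensional slab model; real SLA derives the stiffness and strength reduction factors from the tensile softening curve of the material:

```python
# Minimal sketch of a Sequentially Linear Analysis (SLA) loop.
# Toy model (an assumption, not the thesis model): N parallel springs sharing
# a common elongation u; spring i has stiffness k[i] and strength f_t[i].
import numpy as np

def sla(k, f_t, n_teeth=5, reduction=0.5, steps=60):
    """Saw-tooth SLA: scale a unit load to the critical element, damage it."""
    k, f_t = k.astype(float), f_t.astype(float)
    teeth = np.zeros(len(k), dtype=int)      # saw-tooth counter per element
    curve = []                               # (load, displacement) points
    for _ in range(steps):
        K = k.sum()
        if K <= 0:
            break
        u_unit = 1.0 / K                     # displacement for a unit load
        s_unit = k * u_unit                  # element forces for a unit load
        with np.errstate(divide="ignore", invalid="ignore"):
            ratios = np.where(s_unit > 0, f_t / s_unit, np.inf)
        crit = int(np.argmin(ratios))        # first element to reach strength
        lam = ratios[crit]                   # critical load multiplier
        if not np.isfinite(lam):
            break
        curve.append((lam, lam * u_unit))    # record one saw-tooth peak
        # damage increment: reduce stiffness and strength of the critical element
        teeth[crit] += 1
        if teeth[crit] >= n_teeth:
            k[crit] = f_t[crit] = 0.0        # element fully softened
        else:
            k[crit] *= reduction
            f_t[crit] *= reduction
    return curve

curve = sla(k=np.array([10.0, 8.0, 6.0]), f_t=np.array([3.0, 4.0, 5.0]))
for lam, u in curve[:5]:
    print(f"load = {lam:.3f}, displacement = {u:.4f}")
```

Because every iteration is a purely linear solve, there is no Newton-Raphson loop that can fail to converge; the nonlinearity enters only through the one-element damage increment per step, which is exactly the trade-off the abstract describes.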
Abstract:
The first study verified the reliability of the Polimedicus software and the effects induced by aerobic training at FatMax intensity. 16 overweight subjects, aged roughly 40-55 years, were recruited and underwent an incremental test in which, once an RER of 0.95 was reached, the load was increased by 1 km/h every minute until exhaustion. It was then verified whether the values extrapolated by the program matched those observed during a 1-hour constant-load test. After 8 weeks of training, the subjects performed a further incremental test. The data showed that Polimedicus is not very reliable, especially for HR. In the second study a new program, Inca, was developed, and its results were compared with the data obtained in the first study with Polimedicus. The final results showed that Inca is more reliable. In the third study, we wanted to verify the accuracy of the FatMax calculation with Inca and the FATmaxwork test. 25 overweight subjects, aged 40-55 years, were recruited and underwent the FATmaxwork test. It was then verified whether the values extrapolated by INCA matched those occurring during a one-hour constant-load test. The analysis showed that the FatMax calculation was accurate during the workload. Conclusion: a certain difficulty emerged in determining this parameter, owing to both inter-individual and intra-individual variability. In the future, INCA will need to be improved to obtain even more valid training protocols.
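The abstract does not state which equation INCA implements for the FatMax computation itself; the sketch below uses the standard Frayn (1983) indirect-calorimetry equation, with hypothetical stage data, purely to illustrate how the maximal-fat-oxidation intensity is located from incremental-test gas exchange:

```python
# Hedged sketch of locating FatMax from incremental-test data.
# The stoichiometric coefficients are the widely used Frayn (1983) equation,
# an assumption here: the abstract does not say which equation INCA uses.

def fat_oxidation(vo2_l_min, vco2_l_min):
    """Fat oxidation rate (g/min) from indirect calorimetry (Frayn 1983)."""
    return 1.67 * vo2_l_min - 1.67 * vco2_l_min

# Hypothetical stage data: (speed km/h, VO2 L/min, VCO2 L/min)
stages = [(6, 1.50, 1.28), (7, 1.80, 1.57), (8, 2.10, 1.90), (9, 2.40, 2.28)]
rates = [(speed, fat_oxidation(vo2, vco2)) for speed, vo2, vco2 in stages]
fatmax_speed, fatmax_rate = max(rates, key=lambda r: r[1])
print(f"FatMax at {fatmax_speed} km/h ({fatmax_rate:.2f} g fat/min)")
```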
Abstract:
Constant developments in the field of offshore wind energy have increased the range of water depths at which wind farms are planned to be installed. Therefore, in addition to monopile support structures suitable for shallow waters (up to 30 m), different types of support structures, able to withstand severe sea conditions at greater water depths, have been developed. For water depths beyond 30 m, the jacket is one of the preferred support types. The jacket is a lightweight support structure which, in combination with the complex nature of environmental loads, is prone to highly dynamic behavior. As a consequence, high stresses with great variability in time can be observed in all structural members. The highest concentration of stresses occurs in the joints, owing to their nature (structural discontinuities) and to the notches along the welds present in the joints. This makes them the weakest elements of the jacket in terms of fatigue. In the numerical modeling of jackets for offshore wind turbines, a reduction of local stresses at the chord-brace joints, and consequently an optimization of the model, can be achieved by implementing joint flexibility in the chord-brace joints. Therefore, in this work, the influence of joint flexibility on the fatigue damage in the chord-brace joints of a numerical jacket model, subjected to advanced load simulations, is studied.
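Fatigue damage at such joints is conventionally accumulated from the simulated stress histories via rainflow counting, an S-N curve, and the Palmgren-Miner sum; a sketch of that standard chain (assumed here, as the thesis procedure may differ in detail):

```latex
N_i = a\,(\Delta\sigma_i)^{-m},
\qquad
D = \sum_i \frac{n_i}{N_i},
\qquad
D \ge 1 \;\Rightarrow\; \text{fatigue failure},
```

where n_i is the number of rainflow-counted cycles at stress range Δσ_i and a, m are the S-N curve parameters of the weld detail. Implementing joint flexibility lowers the local stress ranges Δσ_i entering this sum, which is how it changes the predicted damage D.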
Abstract:
The main objective of this thesis is to obtain a better understanding of methods to assess the stability of a slope. We illustrate the principal variants of the Limit Equilibrium (LE) method found in the literature, focusing our attention on the Minimum Lithostatic Deviation (MLD) method developed by Prof. Tinti and his collaborators (e.g. Tinti and Manucci, 2006, 2008). We had two main goals. The first was to test the MLD method on real cases: we selected the Vajont landslide, with the objective of reconstructing the conditions that caused the destabilisation of Mount Toc, and two sites on the Norwegian margin where failures have not occurred recently, with the aim of evaluating their present stability and assessing under which conditions they might be mobilised. The second goal was to study the stability charts of Taylor and of Michalowski, and to use the MLD method to investigate the correctness and adequacy of this engineering tool.
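As background for the second goal, Taylor's chart condenses the stability of a homogeneous slope into one dimensionless group; in the standard notation (assumed here; the thesis may use different symbols):

```latex
N_s = \frac{c}{F\,\gamma\,H},
```

where c is the soil cohesion, γ the unit weight, H the slope height and F the factor of safety. The chart tabulates N_s against the slope and friction angles, which is exactly the kind of closed-form engineering tool a full LE solution such as the MLD method can cross-check.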
Abstract:
Seismic assessment and seismic strengthening are key issues that must be addressed when protecting and reusing historical buildings. In this thesis, the seismic behavior of hinged steel frames, a structural type common in historical buildings in Shanghai, China, was studied through experimental investigation and theoretical analysis, with a thorough analysis of how non-structural members work together with the steel frames. First, two 1/4-scale hinged steel frames were constructed based on the structural system of Bund 18, a historical building in Shanghai: model M1 without infill walls and model M2 with infill walls. They were tested under horizontal cyclic loads to investigate their seismic behavior. The shaking table test and its results indicated that the seismic behavior of the hinged steel frames could be improved significantly by non-structural members, i.e. the surrounding elements outside the hinged steel frames and the infill walls. Specifically, the columns, which are covered with bricks, consist of I-shaped steel sections and steel plates clenched together, and the steel beams are connected to the steel columns by steel angles, so the structure should be considered a hinged frame. The infill wall acts as a compression diagonal strut withstanding the horizontal load; therefore, the seismic capacity and stiffness of hinged steel frames with infill walls can be estimated using the equivalent compression diagonal strut model. A SAP model was constructed to perform a dynamic nonlinear analysis, and the results obtained were compared with those of the shaking table test. The test results validated that the influence of infill walls on seismic behavior can be estimated using the equivalent diagonal strut model.
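A common closed form for the equivalent strut is the Mainstone-type width (one widely used variant, assumed here; the thesis may adopt a different width formula):

```latex
w = 0.175\,(\lambda_h H)^{-0.4}\, d,
\qquad
\lambda_h = \left(\frac{E_w\, t\, \sin 2\theta}{4\, E_c\, I_c\, h_w}\right)^{1/4},
```

where d is the wall diagonal length, t and h_w the wall thickness and height, θ the strut inclination, E_w and E_c the wall and column moduli, I_c the column second moment of area, and H the column height. Replacing each infill panel by a pin-ended strut of width w and thickness t reproduces the added lateral stiffness and strength in a frame model such as the SAP model described above.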
Abstract:
This study aimed to determine whether Alzheimer caregivers have increased allostatic load compared to non-caregivers. Potential psychological moderators (mastery, depression, and role overload) of the relationship between caregiving status and allostatic load were also explored. Eighty-seven caregivers and 43 non-caregivers underwent biological assessment of allostatic load and psychological assessments. Caregivers had significantly higher allostatic load than non-caregivers (p < .05). Mastery, but not depression or overload, moderated the relationship between caregiving status and allostatic load. In conclusion, allostatic load may represent a link explaining how stress translates into downstream pathology, but more work is necessary to understand the role of psychological factors.
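Moderation of this kind is usually tested as an interaction term in a regression; the sketch below (hypothetical variable names and simulated data, not the study's) shows the structure of such a test:

```python
# Hedged sketch of the moderation test described above: an interaction term
# between caregiver status and mastery predicting allostatic load.
# Variable names and data are hypothetical illustrations.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 130
df = pd.DataFrame({
    "caregiver": rng.integers(0, 2, n),   # 1 = caregiver, 0 = control
    "mastery": rng.normal(0, 1, n),       # centered mastery score
})
df["allostatic_load"] = (2.0 + 0.6 * df.caregiver
                         - 0.4 * df.caregiver * df.mastery
                         + rng.normal(0, 1, n))
# A significant caregiver:mastery coefficient indicates moderation.
model = smf.ols("allostatic_load ~ caregiver * mastery", data=df).fit()
print(model.summary().tables[1])
```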
Abstract:
Different codons encoding the same amino acid are not used equally in protein-coding sequences. In bacteria, there is a bias towards codons with high translation rates. This bias is most pronounced in highly expressed proteins, but a recent study of synthetic GFP-coding sequences did not find a correlation between codon usage and GFP expression, suggesting that such a correlation in natural sequences is not a simple property of translational mechanisms. Here, we investigate the effect of evolutionary forces on codon usage. The relation between codon bias and protein abundance is quantitatively analyzed based on the hypothesis that codon bias evolved to ensure the efficient usage of ribosomes, a precious commodity for fast-growing cells. An explicit fitness landscape is formulated based on bacterial growth laws to relate protein abundance and ribosomal load. The model leads to a quantitative relation between codon bias and protein abundance, which accounts for a substantial part of the observed bias for E. coli. Moreover, by providing an evolutionary link, the ribosome load model resolves the apparent conflict between the observed relation of protein abundance and codon bias in natural sequences and the lack of such a dependence in a synthetic gfp library. Finally, we show that the relation between codon usage and protein abundance can be used to predict protein abundance from genomic sequence data alone, without adjustable parameters.
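The core intuition can be illustrated with a deliberately simple toy model (an illustration only, not the paper's actual fitness landscape): fast codons free ribosomes sooner, so the selective advantage of a fast codon scales with the gene's expression level, and the equilibrium codon bias therefore grows with protein abundance:

```python
# Toy sketch of the ribosome-load idea: selection on codon speed scales with
# protein abundance. All constants are hypothetical illustrations.

TAU_FAST, TAU_SLOW = 0.05, 0.25   # hypothetical per-codon translation times (s)
MU = 1e-3                         # toy mutational bias toward slow codons
C = 1e-6                          # toy fitness cost per ribosome-second

def selection_coefficient(abundance):
    """Benefit of one slow->fast swap: ribosome time saved x abundance x cost."""
    return C * abundance * (TAU_SLOW - TAU_FAST)

def expected_fast_fraction(abundance):
    """Toy mutation-selection balance for a two-state codon."""
    s = selection_coefficient(abundance)
    return s / (s + MU)           # bias grows with expression level

for A in [1e1, 1e3, 1e5]:
    print(f"abundance {A:8.0f}: fast-codon fraction = "
          f"{expected_fast_fraction(A):.2f}")
```

Inverting such a monotone bias-abundance relation is what allows abundance to be predicted from sequence alone, as the abstract states.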
Abstract:
Objectives: To determine the improvement in the positive predictive value of immunological failure criteria for identifying virological failure in HIV-infected children on antiretroviral therapy (ART) when a single targeted viral load measurement is performed in children identified as having immunological failure. Methods: Analysis of data from children (<16 years at ART initiation) at South African ART sites at which CD4 count/percentage and HIV-RNA monitoring are performed 6-monthly. Immunological failure was defined according to both WHO 2010 and United States Department of Health and Human Services (DHHS) 2008 criteria. Confirmed virological failure was defined as HIV-RNA >5000 copies/ml on two consecutive occasions <365 days apart in a child on ART for ≥18 months. Results: Among 2798 children on ART for ≥18 months [median (IQR) age 50 (21-84) months at ART initiation], the cumulative probability of confirmed virological failure by 42 months on ART was 6.3%. Using a targeted viral load after meeting DHHS immunological failure criteria, rather than DHHS immunological failure criteria alone, increased the positive predictive value from 28% to 82%. Targeted viral load improved the positive predictive value of the WHO 2010 criteria for identifying confirmed virological failure from 49% to 82%. Conclusion: The addition of a single viral load measurement in children identified as failing immunologically would prevent most switches to second-line treatment in virologically suppressed children.
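The reported gain follows directly from the definition of positive predictive value; the counts below are hypothetical and chosen only so the two PPVs match the reported 28% and 82%:

```python
# Positive predictive value: of children flagged as failing, how many have
# confirmed virological failure? Counts below are hypothetical illustrations.

def ppv(true_positives, false_positives):
    return true_positives / (true_positives + false_positives)

# Immunological criteria alone: many flags, few confirmed failures.
print(f"criteria alone:   PPV = {ppv(28, 72):.0%}")   # 28%
# Adding one targeted viral load removes most false positives.
print(f"plus targeted VL: PPV = {ppv(28, 6):.0%}")    # 82%
```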
Abstract:
Background: Most adults infected with HIV achieve viral suppression within a year of starting combination antiretroviral therapy (cART). It is important to understand the risk of AIDS events or death for patients with a suppressed viral load. Methods and Findings: Using data from the Collaboration of Observational HIV Epidemiological Research Europe (2010 merger), we assessed the risk of a new AIDS-defining event or death in successfully treated patients. We accumulated episodes of viral suppression for each patient while on cART, each episode beginning with the second of two consecutive plasma viral load measurements <50 copies/ml and ending with either a measurement >500 copies/ml, the first of two consecutive measurements between 50 and 500 copies/ml, cART interruption, or administrative censoring. We used stratified multivariate Cox models to estimate the association between time-updated CD4 cell count and a new AIDS event or death, or death alone. 75,336 patients contributed 104,265 suppression episodes and were suppressed while on cART for a median of 2.7 years. The mortality rate was 4.8 per 1,000 years of viral suppression. A higher CD4 cell count was always associated with a reduced risk of a new AIDS event or death, with a hazard ratio per 100 cells/µl (95% CI) of 0.35 (0.30-0.40) for counts <200 cells/µl, 0.81 (0.71-0.92) for counts 200 to <350 cells/µl, 0.74 (0.66-0.83) for counts 350 to <500 cells/µl, and 0.96 (0.92-0.99) for counts ≥500 cells/µl. A higher CD4 cell count became even more beneficial over time for patients with CD4 cell counts <200 cells/µl. Conclusions: Despite the low mortality rate, the risk of a new AIDS event or death follows a CD4 cell count gradient in patients with viral suppression. A higher CD4 cell count was associated with the greatest benefit for patients with a CD4 cell count <200 cells/µl, but still some slight benefit for those with a CD4 cell count ≥500 cells/µl.
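The episode definition above is essentially a small state machine; the sketch below implements that rule for a single patient's ordered measurements (a simplification: cART-interruption handling is omitted, and thresholds are in copies/ml as in the text):

```python
# Hedged sketch of the suppression-episode rule described above. Input is a
# patient's ordered viral-load measurements while on cART; output is a list
# of (start_index, end_index) episodes.

def suppression_episodes(vl):
    episodes, start, i = [], None, 1
    while i < len(vl):
        if start is None:
            # episode begins with the second of two consecutive VL < 50
            if vl[i - 1] < 50 and vl[i] < 50:
                start = i
        else:
            if vl[i] > 500:                   # a single value >500 ends it
                episodes.append((start, i))
                start = None
            elif (50 <= vl[i] <= 500 and i + 1 < len(vl)
                  and 50 <= vl[i + 1] <= 500):
                episodes.append((start, i))   # first of two 50-500 values
                start = None
        i += 1
    if start is not None:                     # administrative censoring
        episodes.append((start, len(vl) - 1))
    return episodes

# Example: one episode ended by a blip >500, a second by censoring rules.
print(suppression_episodes([40, 30, 45, 600, 20, 25, 120, 130, 40]))
```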