943 results for "Fourth order method"


Relevance:

30.00%

Publisher:

Abstract:

Coupled-cluster (CC) theory is one of the most successful approaches in high-accuracy quantum chemistry. The present thesis makes a number of contributions to the determination of molecular properties and excitation energies within the CC framework. The multireference CC (MRCC) method proposed by Mukherjee and coworkers (Mk-MRCC) has been benchmarked within the singles and doubles approximation (Mk-MRCCSD) for molecular equilibrium structures. It is demonstrated that Mk-MRCCSD yields reliable results for multireference cases where single-reference CC methods fail. At the same time, the present work also illustrates that Mk-MRCC still suffers from a number of theoretical problems and sometimes gives rise to results of unsatisfactory accuracy. To determine polarizability tensors and excitation spectra in the MRCC framework, the Mk-MRCC linear-response function has been derived together with the corresponding linear-response equations. Pilot applications show that Mk-MRCC linear-response theory suffers from a severe problem when applied to the calculation of dynamic properties and excitation energies: The Mk-MRCC sufficiency conditions give rise to a redundancy in the Mk-MRCC Jacobian matrix, which entails an artificial splitting of certain excited states. This finding has established a new paradigm in MRCC theory, namely that a convincing method should not only yield accurate energies, but ought to allow for the reliable calculation of dynamic properties as well. In the context of single-reference CC theory, an analytic expression for the dipole Hessian matrix, a third-order quantity relevant to infrared spectroscopy, has been derived and implemented within the CC singles and doubles approximation. The advantages of analytic derivatives over numerical differentiation schemes are demonstrated in some pilot applications.

Abstract:

Remote sensing is an effective tool for monitoring the environment and the land, thanks to the availability of sensors that image portions of the Earth's surface at fixed time intervals. The multi/hyperspectral images acquired can provide information for different fields of application. This study addresses the issue of soil consumption, which represents an important challenge for correct land management, since it is directly connected with urban runoff, ecosystem fragmentation and the loss of valuable agricultural land. There is still no single definition, nor a single measurement methodology, of soil consumption; in this study it is defined as consumption that causes soil sealing. The chosen area is the Province of Bologna, which extends over 3,702 km2 and is characterized to the north by the Po Plain and to the south by the Apennine chain; according to ISTAT data, in the period 2001-2011 it was the fourth province in Italy with the highest soil consumption. A pixel-based classification was used to map the phenomenon on five Landsat images. Although of medium resolution, and therefore unable to capture every detail, such images are particularly suitable for extended areas like the one chosen and also guarantee broader temporal coverage. The period considered runs from 1987 to 2013 and, through change-detection procedures applied to the produced maps, the study seeks to quantify the phenomenon, compare it with existing data and analyse its spatial distribution.
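The change-detection step described above can be sketched as a comparison of two classified rasters. The arrays, the class coding (1 = sealed soil, 0 = other) and the pixel size are illustrative assumptions, not the study's actual implementation:

```python
import numpy as np

def sealed_soil_change(map_t1, map_t2, pixel_area_m2=900.0):
    """Return the newly sealed area (m^2) between two classified maps.

    A 30 m Landsat pixel covers 900 m^2; both inputs are hypothetical
    2-D integer arrays produced by a pixel-based classifier, where
    1 = sealed (impervious) soil and 0 = other land cover.
    """
    t1 = np.asarray(map_t1)
    t2 = np.asarray(map_t2)
    # pixels that changed from pervious to impervious between the two dates
    newly_sealed = (t1 == 0) & (t2 == 1)
    return newly_sealed.sum() * pixel_area_m2
```

Summing this quantity over consecutive image pairs gives the temporal evolution of soil consumption over the study period.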

Abstract:

Since its discovery, the top quark has been one of the most intensively investigated fields in particle physics. The aim of this thesis is the reconstruction of hadronically decaying top quarks with high transverse momentum (boosted tops) using the Template Overlap Method (TOM). Because of the high energy, the decay products of boosted tops overlap partially or totally and are thus contained in a single large-radius jet (fat jet). TOM compares the internal energy distribution of the candidate fat jet to a sample of tops obtained from a MC simulation (templates). The algorithm is based on the definition of an overlap function, which quantifies the level of agreement between the fat jet and the template, allowing an efficient discrimination of signal from background contributions. A working point was chosen to obtain a signal efficiency close to 90% with a corresponding background rejection of 70%. TOM's performance has been tested on MC samples in the muon channel and compared with the methods previously reported in the literature. All the methods will be merged into a multivariate analysis to give a global top tagging, which will be included in the ttbar production differential cross-section measurement performed on the data acquired in 2012 at sqrt(s) = 8 TeV in the high-pT phase-space region, where new physics processes could appear. Because its performance improves as pT increases, the Template Overlap Method will play a crucial role in the next data taking at sqrt(s) = 13 TeV, where almost all top quarks will be produced at high energy, making the standard reconstruction methods inefficient.
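The overlap function can be illustrated schematically. The Gaussian functional form below follows the general idea described above (comparing the energy captured in cones around each template parton with the template parton momenta), but the cone size, the resolution parameter and the helper itself are illustrative assumptions, not the experiment's implementation:

```python
import math

def overlap(fatjet_constituents, template, cone_r=0.2, sigma_frac=0.33):
    """Schematic overlap score in [0, 1]; 1 = perfect fat-jet/template match.

    fatjet_constituents, template: lists of (pt, eta, phi) tuples.
    The resolution sigma per template parton is taken as a fixed fraction
    of its pt (an assumption for illustration).
    """
    chi2 = 0.0
    for tpt, teta, tphi in template:
        # energy of fat-jet constituents inside a cone around the template parton
        e_cone = sum(
            pt for pt, eta, phi in fatjet_constituents
            if math.hypot(eta - teta,
                          (phi - tphi + math.pi) % (2 * math.pi) - math.pi) < cone_r
        )
        sigma = sigma_frac * tpt
        chi2 += ((e_cone - tpt) / sigma) ** 2
    return math.exp(-0.5 * chi2)
```

In the real method the score is maximized over a large set of templates, and a cut on the best score (the working point) separates top jets from background fat jets.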

Abstract:

The present work deals with the modelling of low-energy electromagnetic and hadronic processes within a manifestly Lorentz-invariant chiral effective field theory that explicitly includes resonant, i.e. vector-meson, degrees of freedom as dynamical fields. This effective theory can therefore be understood as an approximation to the underlying quantum chromodynamics at low energies. Particular attention is paid to the power counting and the renormalization scheme employed, which enable a consistent description of mesonic processes up to energies of about 1 GeV. The power counting rests essentially on a large-N_c argument (N_c being the number of colour degrees of freedom) and allows Goldstone bosons (pions) and resonances (rho and omega mesons) to be treated on an equal footing. As renormalization scheme, the complex-mass scheme, which is particularly suited to particles that are unstable with respect to the strong interaction, is used as an extension of the extended on-mass-shell scheme; combined with the BPHZ renormalization procedure (named after Bogoliubov, Parasiuk, Hepp and Zimmermann), it provides a powerful framework for computing quantum corrections in this chiral effective field theory. All calculations include terms of chiral order four as well as one-loop Feynman diagrams. The processes considered include the vector form factor of the pion in the timelike region, real Compton scattering (and photon fusion) in the neutral and charged channels, and virtual Compton scattering embedded in electron-positron annihilation. Finally, a set of experimental data for various observables is used to extract the low-energy coupling constants of the theory.
The methods and procedures developed here, in particular their technical implementation, are very general in nature and can therefore be adapted to further problems in this area of low-energy quantum chromodynamics.

Abstract:

In any terminological study, candidate term extraction is a very time-consuming task. Corpus analysis tools have automated some processes, allowing the detection of relevant data within the texts and thereby facilitating term candidate selection as well. Nevertheless, these tools are normally not specific to terminology research; therefore, the units that are automatically extracted need manual evaluation. Over the last few years some software products have been developed specifically for automatic term extraction. They are based on corpus analysis, but use linguistic and statistical information to filter data more precisely. As a result, the time needed for manual evaluation is reduced. In this framework, we tried to understand whether, and how, these new tools can really be an advantage. In order to develop our project, we simulated a terminology study: we chose a domain (i.e. the legal framework for medicinal products for human use) and compiled a corpus from which we extracted terms and phraseologisms using AntConc, a corpus analysis tool. Afterwards, we compared our list with the lists extracted automatically by three different tools (TermoStat Web, TaaS and Sketch Engine) in order to evaluate their performance. In the first chapter we describe some principles relating to terminology and phraseology in language for special purposes and show the advantages offered by corpus linguistics. In the second chapter we illustrate some of the main concepts of the selected domain, as well as some of the main features of legal texts. In the third chapter we describe automatic term extraction and the main criteria for evaluating it; moreover, we introduce the term-extraction tools used for this project. In the fourth chapter we describe our research method and, in the fifth chapter, we present our results and draw some preliminary conclusions on the performance and usefulness of term-extraction tools.
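The core of the evaluation, comparing each tool's automatically extracted candidates against the manually validated list, reduces to a precision/recall computation. A minimal sketch follows; the example terms are hypothetical entries from the medicinal-products domain, not the study's actual lists:

```python
def precision_recall(extracted, reference):
    """Score an extractor's candidate list against a manually validated list.

    precision: fraction of extracted candidates that are real terms;
    recall: fraction of the reference terms that were found.
    """
    extracted, reference = set(extracted), set(reference)
    tp = len(extracted & reference)  # true positives: validated candidates
    precision = tp / len(extracted) if extracted else 0.0
    recall = tp / len(reference) if reference else 0.0
    return precision, recall
```

A tool that extracts many non-terms (noise such as function words) loses precision, while one that misses domain terms loses recall; comparing both figures across tools is what the fifth chapter's evaluation amounts to.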

Abstract:

Satellite image classification involves designing and developing efficient image classifiers. With satellite image data and image analysis methods multiplying rapidly, selecting the right mix of data sources and data analysis approaches has become critical to the generation of quality land-use maps. In this study, a new postprocessing information fusion algorithm for the extraction and representation of land-use information based on high-resolution satellite imagery is presented. This approach can produce land-use maps with sharp interregional boundaries and homogeneous regions. The proposed approach is conducted in five steps. First, a GIS layer - ATKIS data - was used to generate two coarse homogeneous regions, i.e. urban and rural areas. Second, a thematic (class) map was generated by use of a hybrid spectral classifier combining the Gaussian Maximum Likelihood algorithm (GML) and the ISODATA classifier. Third, a probabilistic relaxation algorithm was performed on the thematic map, resulting in a smoothed thematic map. Fourth, edge detection and edge thinning techniques were used to generate a contour map with pixel-width interclass boundaries. Fifth, the contour map was superimposed on the thematic map by use of a region-growing algorithm with the contour map and the smoothed thematic map as two constraints. To implement the proposed method, a software package was developed in the C programming language. This software package comprises the GML algorithm, a probabilistic relaxation algorithm, the TBL edge detector, an edge thresholding algorithm, a fast parallel thinning algorithm, and a region-growing information fusion algorithm. The county of Landau in the state of Rheinland-Pfalz, Germany, was selected as a test site. High-resolution IRS-1C imagery was used as the principal input data.
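As a rough illustration of the smoothing in step 3, a majority-vote filter conveys the effect of probabilistic relaxation on a thematic map. The real algorithm iteratively updates per-pixel class probabilities from neighbourhood compatibilities; this stand-in is an assumption for illustration only:

```python
import numpy as np

def majority_smooth(label_map, iterations=1):
    """Crude stand-in for probabilistic relaxation: replace each interior
    pixel's class label by the most frequent label in its 3x3 neighbourhood,
    which removes isolated misclassified pixels (salt-and-pepper noise)."""
    lm = np.asarray(label_map).copy()
    rows, cols = lm.shape
    for _ in range(iterations):
        out = lm.copy()
        for i in range(1, rows - 1):
            for j in range(1, cols - 1):
                window = lm[i - 1:i + 2, j - 1:j + 2].ravel()
                vals, counts = np.unique(window, return_counts=True)
                out[i, j] = vals[np.argmax(counts)]
        lm = out
    return lm
```

Probabilistic relaxation achieves the same homogenization while weighting neighbours by class-compatibility coefficients rather than a simple vote, which preserves genuine boundaries better.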

Abstract:

Groundwater represents one of the most important resources in the world, and it is essential to prevent its pollution and to consider remediation in case of contamination. According to the scientific community, the characterization and management of contaminated sites have to be performed in terms of contaminant fluxes, considering their spatial and temporal evolution. One of the most suitable approaches to determine the spatial distribution of pollutants and to quantify contaminant fluxes in groundwater is the use of control panels. The determination of contaminant mass flux requires measurement of the contaminant concentration in the moving phase (water) and of the velocity/flux of the groundwater. In this Master's thesis a new solute mass flux measurement approach is proposed, based on an integrated control-panel methodology combined with the Finite Volume Point Dilution Method (FVPDM) for the monitoring of transient groundwater fluxes. Moreover, a new adsorption passive sampler, which allows the variation of solute concentration with time to be captured, is designed. The present work contributes to the development of this approach on three key points. First, the ability of the FVPDM to monitor transient groundwater fluxes was verified during a step-drawdown test at the experimental site of Hermalle Sous Argentau (Belgium). The results showed that this method can be used, with optimal results, to follow transient groundwater fluxes. Moreover, performing the FVPDM in several piezometers during a pumping test allows the different flow rates and flow regimes that can occur in the various parts of an aquifer to be determined. The second field test, aimed at determining the representativeness of a control panel for measuring mass flux in groundwater, underlined that wrong evaluations of Darcy fluxes and discharge surfaces can lead to an incorrect estimation of mass fluxes, and that this technique therefore has to be used with caution.
Thus, a detailed geological and hydrogeological characterization must be conducted before applying this technique. Finally, the third outcome of this work concerns laboratory experiments. The tests conducted on several types of adsorption material (Oasis HLB cartridges, TDS-ORGANOSORB 10 and TDS-ORGANOSORB 10-AA), in order to determine the optimum medium for dimensioning the passive sampler, highlighted the need to find a material with reversible adsorption behaviour to fully satisfy the requirements of the new passive sampling technique.
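The point-dilution principle underlying the FVPDM can be sketched as an exponential tracer decay: groundwater flowing through the well screen flushes out an injected tracer, so monitoring its concentration over time yields the transient flux. This is the classical single-well dilution formula, an illustration of the measurement idea rather than the FVPDM computation itself, and the parameter names are assumptions:

```python
import math

def tracer_concentration(c0, q_transit, volume, t):
    """Tracer concentration in a well mixing volume after time t.

    c0: initial tracer concentration; q_transit: groundwater flow rate
    through the screened section; volume: well mixing volume. Well-mixed
    conditions are assumed, giving C(t) = C0 * exp(-Q * t / V).
    """
    return c0 * math.exp(-q_transit * t / volume)
```

Inverting this relation from a measured concentration time series gives the transit flow rate, and repeating the measurement over time is what allows transient fluxes, such as those induced by a step-drawdown test, to be followed.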

Abstract:

The PM3 semiempirical quantum-mechanical method was found to systematically describe intermolecular hydrogen bonding in small polar molecules. PM3 shows charge transfer from the donor to acceptor molecules on the order of 0.02-0.06 units of charge when strong hydrogen bonds are formed. The PM3 method is predictive; calculated hydrogen bond energies with an absolute magnitude greater than 2 kcal mol-1 suggest that the global minimum is a hydrogen-bonded complex, whereas absolute energies less than 2 kcal mol-1 imply that other van der Waals complexes are more stable. The geometries of the PM3 hydrogen-bonded complexes agree with high-resolution spectroscopic observations, gas electron diffraction data, and high-level ab initio calculations. The main limitations of the PM3 method are the underestimation of hydrogen bond lengths by 0.1-0.2 Å for some systems and the underestimation of reliable experimental hydrogen bond energies by approximately 1-2 kcal mol-1. The PM3 method predicts that ammonia is a good hydrogen bond acceptor and a poor hydrogen donor when interacting with neutral molecules. Electronegativity differences between F, N, and O predict that donor strength follows the order F > O > N and acceptor strength follows the order N > O > F. In the calculations presented in this article, the PM3 method mirrors these electronegativity differences, predicting the F-H- - -N bond to be the strongest and the N-H- - -F bond the weakest. It appears that the PM3 Hamiltonian is able to model hydrogen bonding because of the reduction of two-center repulsive forces brought about by the parameterization of the Gaussian core-core interactions. The ability of the PM3 method to model intermolecular hydrogen bonding means reasonably accurate quantum-mechanical calculations can be applied to small biological systems.

Abstract:

Instrumental daily series of temperature are often affected by inhomogeneities. Several methods are available for their correction at monthly and annual scales, whereas few exist for daily data. Here, an improved version of the higher-order moments (HOM) method, the higher-order moments for autocorrelated data (HOMAD), is proposed. HOMAD addresses the main weaknesses of HOM, namely data autocorrelation and the subjective choice of regression parameters. Simulated series are used for the comparison of both methodologies. The results show that HOMAD outperforms HOM for small samples. Additionally, three daily temperature time series from stations in the eastern Mediterranean are used to show the impact of homogenization procedures on trend estimation and the assessment of extremes. HOMAD provides an improved correction of daily temperature time series and further supports the use of corrected daily temperature time series prior to climate change assessment.
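The flavour of an HOM-style correction can be sketched as follows. This is a schematic illustration of the underlying idea only, modelling the candidate-minus-reference difference in an inhomogeneous segment as a smooth (higher-order polynomial) function of the observed value and subtracting it; it is not the HOMAD algorithm, which additionally handles autocorrelation and parameter selection:

```python
import numpy as np

def hom_like_correction(candidate, reference, degree=2):
    """Correct a candidate daily series against a homogeneous reference.

    Fits the difference (candidate - reference) as a polynomial in the
    candidate value, so the correction can vary across the temperature
    distribution, then removes the modelled difference from each day.
    """
    candidate = np.asarray(candidate, dtype=float)
    reference = np.asarray(reference, dtype=float)
    diff = candidate - reference
    coeffs = np.polyfit(candidate, diff, degree)  # value-dependent bias model
    return candidate - np.polyval(coeffs, candidate)
```

A value-dependent correction matters for daily data because inhomogeneities (e.g. a sensor change) often bias cold and warm days differently, which a single monthly offset cannot capture.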

Abstract:

Projects for the developing world usually find themselves at the bottom of an engineer's priority list. There is often very little engineering effort placed on creating new products for the poorest people in the world. This trend is beginning to change as the potential of these projects is recognized. Engineers are beginning to tackle some of the direst issues in the developing world, and many are having positive impacts. However, the conditions needed to support these projects can often only be maintained in the short term. There is now a need for greater sustainability. Sustainability has a wide variety of definitions in both business and engineering. These concepts are analyzed and synthesized to develop a broad meaning of sustainability in the developing world, stemming primarily from the "triple bottom line" concept of economic, social, and environmental sustainability. Using this model and several international standards, this thesis develops a metric for guiding and evaluating the sustainability of engineering projects. The metric contains qualitative questions that investigate the sustainability of a project. It is used to assess several existing projects in order to identify flaws. Specifically, three projects seeking to deliver eyeglasses are analyzed for weaknesses to help define a new design approach for achieving better results. Using the metric as a guiding tool, teams designed two pieces of optometry equipment: one to cut lenses for eyeglasses and the other to diagnose refractive error, i.e. prescription. These designs were created and prototyped in both the developed and developing worlds in order to determine general feasibility. Although there is a recognized need for eventual design iterations, the whole project is evaluated using the developed metric and compared to the existing projects.
Overall, the success demonstrates the improvements in the long-term sustainability of the project resulting from the use of the sustainability metric.

Abstract:

The purpose of this research project is to study an innovative method for the stability assessment of structural steel systems, namely the Modified Direct Analysis Method (MDM). This method is intended to simplify an existing design method, the Direct Analysis Method (DM), by assuming a sophisticated second-order elastic structural analysis will be employed that can account for member and system instability, thereby allowing the design process to be reduced to confirming the capacity of member cross-sections. This last check can be easily completed by substituting an effective length of KL = 0 into existing member design equations. This simplification is particularly useful for structural systems in which it is not clear how to define the member slenderness L/r because the laterally unbraced length L is not apparent, such as arches and the compression chord of an unbraced truss. To study the feasibility and accuracy of this new method, a set of 12 benchmark steel structural systems previously designed and analyzed by former Bucknell graduate student Jose Martinez-Garcia, together with a single column, was modeled and analyzed using the nonlinear structural analysis software MASTAN2. A series of MATLAB-based programs was prepared by the author to provide the code-checking requirements for investigating the MDM. By comparing MDM and DM results against the more advanced distributed plasticity analysis results, it is concluded that the stability of structural systems can be adequately assessed in most cases using MDM, and that MDM often appears to be a more accurate but less conservative method in assessing stability.
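The cross-section check to which MDM reduces the design process can be sketched with a beam-column interaction equation of the AISC H1-1 form; with KL = 0 the axial capacity becomes the cross-section (squash) strength rather than a buckling strength. The function and the numbers in the test are illustrative, not a design calculation:

```python
def interaction_ratio(Pr, Pc, Mr, Mc):
    """AISC H1-1 style beam-column interaction check (single-axis bending).

    Pr, Mr: required axial force and moment from the second-order analysis;
    Pc, Mc: cross-section axial and flexural capacities (with KL = 0,
    Pc is the squash load rather than a member buckling strength).
    A ratio <= 1.0 means the section is adequate.
    """
    if Pr / Pc >= 0.2:
        return Pr / Pc + (8.0 / 9.0) * (Mr / Mc)
    return Pr / (2.0 * Pc) + Mr / Mc
```

Because the second-order analysis already captures member and system instability, checking every cross-section with this ratio is the only member verification MDM requires.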

Abstract:

The socialisation of mentally handicapped people is a long-term process during which the disabled person learns new habits and abilities step by step through education and training. Anxiety and neuroses due to an inadequate social environment can place obstacles in the path of the disabled person's integration into society. A method of regulating the psycho-physiological condition of mentally handicapped people (MRPC) was developed in order to reduce anxiety and neuropsychological tension and to establish positive social attitudes. Both verbal and non-verbal means of manipulating the psycho-physiological condition were used, and experimental and control groups were formed from among the clients of Israelian's institute. The experimental groups applied the new method for six months, leading to a significant shift in the response of the clients involved. Expressed anxiety and defensive responses to mental tasks were transformed into orienting responses after 30 psycho-regulative exercises. Cognitive functions such as attention and memory also improved significantly. EEG examinations during the process of psycho-regulation revealed a tendency towards changed brain activity, with increased fast-frequency values in the alpha band. Israelian concludes that the application of the MRPC creates better functional conditions for the socialisation of mentally handicapped people.

Abstract:

Long-term follow-up of patients with total hip arthroplasty (THA) revealed a marked deterioration of walking capacities in Charnley class B after postoperative year 4. We hypothesized that a specific group of patients, namely those with unilateral hip arthroplasty and an untreated but affected contralateral hip, was responsible for this observation. Therefore, we conducted a study taking into consideration the two subclasses that make up Charnley class B: patients with unilateral THA and contralateral hip disease, and patients with bilateral THA. A sample of 15,160 patients with 35,773 follow-ups that were prospectively collected over 10 years was evaluated. The sample was categorized into four classes according to a new modified Charnley classification. Annual analyses of the proportion of patients with ambulation longer than 60 min were conducted. The traditionally labeled Charnley class B consists of two very different patient groups with respect to their walking capacities. Those with unilateral THA and contralateral hip disease have below-average walking capacities and a deterioration of ambulation beginning 3 to 4 years after surgery. Those with bilateral THA have stable above-average walking capacities similar to Charnley class A. An extension of the traditional Charnley classification is proposed, taking into account the two different patient groups in Charnley class B. The new fourth Charnley class consists of patients with bilateral THA and was labeled BB in order to express the presence of two artificial hip joints and to preserve the traditional classification A through C.
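The extended classification can be expressed as a simple decision rule. The function and its inputs are illustrative assumptions, and the definitions of classes A and C are simplified renderings of the traditional criteria:

```python
def charnley_class(n_tha, contralateral_hip_diseased=False,
                   walking_limited_elsewhere=False):
    """Assign a patient to a class of the modified Charnley classification.

    n_tha: number of replaced hips (1 or 2). The flags are hypothetical
    per-patient attributes used here for illustration.
    """
    if walking_limited_elsewhere:
        return "C"   # another condition limits walking capacity
    if n_tha == 2:
        return "BB"  # bilateral THA: walking similar to class A
    if contralateral_hip_diseased:
        return "B"   # unilateral THA with diseased contralateral hip
    return "A"       # unilateral THA, healthy contralateral hip
```

Splitting the old class B in this way separates the subgroup whose ambulation deteriorates after year 3-4 (class B) from the stable bilateral-THA subgroup (class BB).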

Abstract:

Ligament balancing in total knee arthroplasty may have an important influence on joint stability and prosthesis lifetime. In order to provide quantitative information and assistance during ligament balancing, a device that intraoperatively measures knee joint forces and moments was developed. Its performance and surgical advantages were evaluated on six cadaver specimens mounted on a knee joint loading apparatus allowing unconstrained knee motion as well as compression and varus-valgus loading. Four different experiments were performed on each specimen. (1) Knee joints were axially loaded. Comparison between applied and measured compressive forces demonstrated the accuracy and reliability of in situ measurements (1.8 N). (2) Assessment of knee stability based on condyle contact forces or varus-valgus moments was compared to the current surgical method (difference of varus-valgus loads causing condyle lift-off). The force-based approach was equivalent to the surgical method, while the moment-based approach, which is considered optimal, showed a tendency towards lateral imbalance. (3) To estimate the importance of keeping the patella in its anatomical position during imbalance assessment, the effect of patellar eversion on the mediolateral distribution of tibiofemoral contact forces was measured. One fourth of the contact force induced by the patellar load was shifted to the lateral compartment. (4) The effect of minor and major medial collateral ligament releases was biomechanically quantified. On average, the medial contact force was reduced by 20% and 46%, respectively. Large variation among specimens reflected the difficulty of ligament release and the need for intraoperative force monitoring. This series of experiments thus demonstrated the device's potential to improve ligament balancing and the survivorship of total knee arthroplasty.

Abstract:

Granger causality (GC) is a statistical technique used to estimate temporal associations in multivariate time series. Many applications and extensions of GC have been proposed since its formulation by Granger in 1969. Here we control for potentially mediating or confounding associations between time series in the context of event-related electrocorticographic (ECoG) time series. A pruning approach to remove spurious connections and simultaneously reduce the required number of estimations to fit the effective connectivity graph is proposed. Additionally, we consider the potential of adjusted GC applied to independent components as a method to explore temporal relationships between underlying source signals. Both approaches overcome limitations encountered when estimating many parameters in multivariate time-series data, an increasingly common predicament in today's brain mapping studies.
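The idea underlying GC can be illustrated with a minimal bivariate, lag-1 least-squares test: does adding the past of x improve the prediction of y beyond y's own past? This sketch is far simpler than the adjusted, multivariate estimation with confound control used in the study, and is for illustration only:

```python
import numpy as np

def granger_stat(x, y):
    """F-statistic for lag-1 Granger causality from x to y.

    Compares the residual sum of squares of a restricted AR model
    (y_t ~ y_{t-1}) against a full model (y_t ~ y_{t-1} + x_{t-1});
    a large value indicates that past x helps predict y.
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    yt, ylag, xlag = y[1:], y[:-1], x[:-1]
    # restricted model: intercept + own past
    A_r = np.column_stack([np.ones_like(ylag), ylag])
    rss_r = np.sum((yt - A_r @ np.linalg.lstsq(A_r, yt, rcond=None)[0]) ** 2)
    # full model: intercept + own past + past of x
    A_f = np.column_stack([np.ones_like(ylag), ylag, xlag])
    rss_f = np.sum((yt - A_f @ np.linalg.lstsq(A_f, yt, rcond=None)[0]) ** 2)
    n = len(yt)
    return (rss_r - rss_f) / (rss_f / (n - 3))  # 1 restriction, n-3 dof
```

With many ECoG channels, fitting every such pairwise (or conditional) model inflates the number of estimated parameters rapidly, which is exactly the predicament the pruning approach and the independent-components variant are designed to mitigate.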