947 results for Which-way experiments


Relevance:

40.00%

Publisher:

Abstract:

Dissertation submitted to obtain the degree of Doctor in Physics Engineering

Relevance:

40.00%

Publisher:

Abstract:

The objective of this work was to determine the efficiency of the Papadakis method in evaluating the quality of experiments with multiple-harvest vegetable crops and in estimating the covariate and the optimal plot size. Data from nine uniformity trials (five with snap bean, two with zucchini, and two with sweet pepper) and from one experiment with treatments (with sweet pepper) were used. The uniformity trials were used to define the best way of calculating the covariate and to determine the optimal plot size. In the experiment with treatments, analyses of variance and covariance were performed, with the covariate calculated by the Papadakis method, and experimental precision was evaluated based on four statistics. The use of analysis of covariance with the covariate obtained by the Papadakis method increases the quality of experiments with multiple-harvest vegetable crops and allows the use of smaller plot sizes. The best covariate is the one that uses one neighboring plot on each side of the reference plot.
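The abstract only names the method, so the following is a minimal sketch, assuming a rectangular grid of plots, of how a Papadakis covariate can be computed as the mean residual of the plots immediately to the left and right of the reference plot (the neighbour layout the abstract favours). The row/column layout, edge handling, and function names are assumptions for illustration, not taken from the paper.

```python
import numpy as np

def papadakis_covariate(yields, treatment_means):
    """Hypothetical sketch: covariate = mean residual of the plot on each
    side (left/right within a row) of the reference plot.

    yields          : 2-D array of observed plot yields (rows x columns)
    treatment_means : 2-D array with the treatment mean assigned to each plot
    """
    resid = yields - treatment_means                 # plot residuals
    n_rows, n_cols = resid.shape
    cov = np.zeros_like(resid, dtype=float)
    for i in range(n_rows):
        for j in range(n_cols):
            neighbours = []
            if j > 0:
                neighbours.append(resid[i, j - 1])   # plot to the left
            if j < n_cols - 1:
                neighbours.append(resid[i, j + 1])   # plot to the right
            cov[i, j] = np.mean(neighbours)          # border plots use one neighbour
    return cov

# The resulting covariate would then be entered into the analysis of
# covariance of the yields.
```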

Relevance:

40.00%

Publisher:

Abstract:

This paper explores transparency in the decision-making of the European Central Bank (ECB). According to the ECB's definition, transparency means that the central bank provides the general public with all relevant information on its strategy, assessments and policy decisions, as well as its procedures, in an open, clear and timely manner. In this paper, however, the interpretation of transparency is somewhat broader: information is freely available and directly accessible to those who will be affected by the decisions, and individuals should be able to comprehend this material. The ECB's negative attitude towards the publication of documents has demonstrated the central bank's reluctance to strive towards more extensive transparency. By virtue of the definition adopted by the ECB, the bank itself is responsible for determining what counts as relevant information. On the basis of the EU treaties, this paper assesses the ECB's accountability, concentrating especially on transparency, by employing principal-agent theory and a constitutional approach. Traditionally, the definite mandate and the tenet of central bank independence have been used to justify the limited accountability. The de facto competence of the ECB has, however, considerably expanded as the central bank has decisively resorted to non-standard measures in order to combat the economic turbulence facing Europe. It is alleged that non-standard monetary policy constitutes a grey zone, occasionally resembling economic or fiscal policy. Notwithstanding, the European Court of Justice has repeatedly approved these measures. This dynamic interpretation of the treaties seems to allow temporary exceptions from the central bank's primary objective during extraordinary times. Regardless, the paper suggests that the accountability nexus defined in the treaties is not sufficient to guarantee the accountability of the ECB after the adoption of its new, more active role. Enhanced transparency would help the ECB to maintain its credibility. Investing in the quality of the monetary dialogue between the Parliament and the ECB appears to be the most adequate and practicable way of accomplishing this. As a result of upgraded transparency, the legitimacy of the central bank would not rest solely on its policy outputs.

Relevance:

40.00%

Publisher:

Abstract:

A review is given of the mechanics of cutting, ranging from the slicing of thin floppy offcuts (where there is negligible elasticity and no permanent deformation of the offcut) to the machining of ductile metals (where there is severe permanent distortion of the offcut/chip). Materials scientists employ the former conditions to determine the fracture toughness of ‘soft’ solids such as biological materials and foodstuffs. In contrast, traditional analyses of metalcutting are based on plasticity and friction only, and do not incorporate toughness. The machining theories are inadequate in a number of ways but a recent paper has shown that when ductile work of fracture is included many, if not all, of the shortcomings are removed. Support for the new analysis is given by examination of FEM simulations of metalcutting which reveal that a ‘separation criterion’ has to be employed at the tool tip. Some consideration shows that the separation criteria are versions of void-initiation-growth-and-coalescence models employed in ductile fracture mechanics. The new analysis shows that cutting forces for ductile materials depend upon the fracture toughness as well as plasticity and friction, and reveals a simple way of determining both toughness and flow stress from cutting experiments. Examples are given for a wide range of materials including metals, polymers and wood, and comparison is made with the same properties independently determined using conventional testpieces. Because cutting can be steady state, a new way is presented for simultaneously measuring toughness and flow stress at controlled speeds and strain rates.
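The abstract states that cutting force depends on toughness as well as plasticity and friction, and that both toughness and flow stress can be recovered from cutting experiments. The sketch below assumes a simple linear relation between cutting force per unit width and uncut chip thickness, with the intercept associated with the specific work of fracture and the slope with the plasticity/friction term; the data and the exact relation are illustrative assumptions, not the paper's analysis.

```python
import numpy as np

# Hypothetical cutting data: uncut chip thickness t (mm) and cutting force
# per unit width Fc/w (N/mm), as might be measured at several depths of cut.
t = np.array([0.05, 0.10, 0.15, 0.20, 0.25])          # mm
fc_over_w = np.array([18.0, 31.5, 44.8, 58.6, 72.1])  # N/mm

# Fit the assumed linear model Fc/w = slope * t + intercept.
slope, intercept = np.polyfit(t, fc_over_w, 1)

# Under this assumption the intercept estimates the specific work of
# fracture (1 N/mm = 1 kJ/m^2) and the slope bundles the flow stress with
# the shear-plane and friction geometry of the particular test.
print(f"toughness estimate R ~ {intercept:.1f} kJ/m^2")
print(f"plasticity/friction slope ~ {slope:.1f} N/mm^2")
```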

Relevance:

40.00%

Publisher:

Abstract:

A method to solve a quasi-geostrophic two-layer model including the variation of static stability is presented. The divergent part of the wind is incorporated by means of an iterative procedure. The procedure is rather fast, and the computation time is only 60–70% longer than for the usual two-layer model. The method of solution is justified by the conservation of the difference between the gross static stability and the kinetic energy. To eliminate side-boundary conditions, the experiments have been performed on a zonal channel model. The investigation falls mainly into three parts. The first part (section 5) contains a discussion of the significance of some physically inconsistent approximations. It is shown that these physical inconsistencies are rather serious: for the inconsistent models studied, the total kinetic energy increased faster than the gross static stability. In the next part (section 6) we study the effect of a Jacobian difference operator which conserves the total kinetic energy. The use of this operator in two-layer models gives a slight improvement, but probably has no practical use in short-period forecasts. It is also shown that the energy-conservative operator changes the wave speed in an erroneous way if the wave number or the grid length is large in the meridional direction. In the final part (section 7) we investigate the behaviour of baroclinic waves for different initial states and for two energy-consistent models, one with constant and one with variable static stability. According to linear theory, the waves adjust rather rapidly in such a way that the temperature wave lags behind the pressure wave, independent of the initial configuration. Thus, both models give rise to a baroclinic development even if the initial state is quasi-barotropic. The effect of the variation of static stability is very small; qualitative differences in the development are observed only during the first 12 hours. For an amplifying wave there is a stabilization over the troughs and a destabilization over the ridges.
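The kinetic-energy-conserving Jacobian difference operator discussed in section 6 is, in modern terms, usually written as the Arakawa (1966) average of three second-order Jacobians. The sketch below is a minimal numpy illustration on a doubly periodic grid; the paper's channel geometry, boundary treatment, and exact operator are assumptions and may differ.

```python
import numpy as np

def arakawa_jacobian(psi, zeta, dx, dy):
    """Energy- and enstrophy-conserving Jacobian J(psi, zeta) on a doubly
    periodic grid (Arakawa-type average); axis 0 is x, axis 1 is y."""
    def sh(f, ix, jy):
        # value of f at grid point (i + ix, j + jy), placed at (i, j)
        return np.roll(np.roll(f, -ix, axis=0), -jy, axis=1)

    j_pp = ((sh(psi, 1, 0) - sh(psi, -1, 0)) * (sh(zeta, 0, 1) - sh(zeta, 0, -1))
            - (sh(psi, 0, 1) - sh(psi, 0, -1)) * (sh(zeta, 1, 0) - sh(zeta, -1, 0)))

    j_px = (sh(psi, 1, 0) * (sh(zeta, 1, 1) - sh(zeta, 1, -1))
            - sh(psi, -1, 0) * (sh(zeta, -1, 1) - sh(zeta, -1, -1))
            - sh(psi, 0, 1) * (sh(zeta, 1, 1) - sh(zeta, -1, 1))
            + sh(psi, 0, -1) * (sh(zeta, 1, -1) - sh(zeta, -1, -1)))

    j_xp = (sh(zeta, 0, 1) * (sh(psi, 1, 1) - sh(psi, -1, 1))
            - sh(zeta, 0, -1) * (sh(psi, 1, -1) - sh(psi, -1, -1))
            - sh(zeta, 1, 0) * (sh(psi, 1, 1) - sh(psi, 1, -1))
            + sh(zeta, -1, 0) * (sh(psi, -1, 1) - sh(psi, -1, -1)))

    return (j_pp + j_px + j_xp) / (12.0 * dx * dy)

# On a periodic domain, sum(psi * J) and sum(zeta * J) should vanish to
# round-off; this is the discrete analogue of kinetic-energy and enstrophy
# conservation that the abstract refers to.
```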

Relevance:

40.00%

Publisher:

Abstract:

Although the aim of empirical software engineering is to provide evidence for selecting the appropriate technology, it appears that there is a lack of recognition of this work in industry. Results from empirical research only rarely seem to find their way to company decision makers. If information relevant for software managers is provided in reports on experiments, such reports can be considered as a source of information for them when they are faced with making decisions about the selection of software engineering technologies. To bridge this communication gap between researchers and professionals, we propose characterizing the information needs of software managers in order to show empirical software engineering researchers which information is relevant for decision-making and thus enable them to make this information available. We empirically investigated decision makers' information needs to identify which information they need to judge the appropriateness and impact of a software technology. We empirically developed a model that characterizes these needs. To ensure that researchers provide relevant information when reporting results from experiments, we extended existing reporting guidelines accordingly. We performed an experiment to evaluate our model with regard to its effectiveness. Software managers who read an experiment report according to the proposed model judged the technology's appropriateness significantly better than those reading a report about the same experiment that did not explicitly address their information needs. Our research shows that information regarding a technology, the context in which it is supposed to work, and most importantly, the impact of this technology on development costs and schedule as well as on product quality is crucial for decision makers.

Relevance:

40.00%

Publisher:

Abstract:

Context. This thesis is framed in experimental software engineering. More concretely, it addresses the problems that arose when assessing process conformance in test-driven development experiments conducted by UPM's Experimental Software Engineering group. Process conformance was studied using Besouro, an Eclipse plug-in tool. It has been observed that Besouro does not work correctly in some circumstances, which casts doubt on the correctness of the existing experimental data and renders it unusable. Aim. The main objective of this work is the identification and correction of Besouro's faults. A secondary goal is fixing the datasets already obtained in past experiments to the greatest possible extent, so that existing experimental results can be used with confidence. Method. (1) Test Besouro using different sequences of events (creation of methods, assertions, etc.) to identify the underlying faults; (2) fix the code; and (3) fix the datasets using code specially created for this purpose. Results. (1) We confirmed the existence of several faults in Besouro's code that affected Test-First and Test-Last episode identification. These faults caused the incorrect identification of 20% of the episodes. (2) We were able to fix Besouro's code. (3) The correction of the existing datasets was possible, subject to some restrictions (such as the impossibility of tracing code-size increases back to programming time). Conclusion. The results of past experiments that depend on Besouro's data may not be trustworthy. We suspect that more faults remain in Besouro's code, whose identification requires further analysis.

Relevance:

40.00%

Publisher:

Abstract:

Reproducibility of scientific studies and results is a goal that every scientist must pursue when announcing research outcomes. The rise of computational science, as a way of conducting empirical studies by using mathematical models and simulations, has opened a new range of challenges in this context. The adoption of workflows as a way of detailing the scientific procedure of these experiments, along with the experimental-data conservation initiatives undertaken during the last decades, has partially eased this problem. However, in order to fully address it, the conservation and reproducibility of the computational equipment related to them must also be considered. The wide range of software and hardware resources required to execute a scientific workflow implies that a comprehensive description detailing what those resources are and how they are arranged is necessary. In this thesis we address the issue of reproducibility of execution environments for scientific workflows by documenting them in a formalized way, which can later be used to obtain an equivalent environment. To do so, we propose a set of semantic models for representing and relating the relevant information about those environments, as well as a set of tools that use these models to generate a description of the infrastructure, and an algorithmic process that consumes these descriptions to derive a new execution-environment specification, which can be enacted into a new equivalent environment using virtualization solutions. We apply these three contributions to a set of representative scientific experiments, belonging to different scientific domains and exposing different software and hardware requirements. The obtained results prove the feasibility of the proposed approach by successfully reproducing the target experiments under different virtualization environments.
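The thesis itself defines semantic (ontology-based) models and tools; the following is only a hypothetical, heavily simplified illustration of the underlying idea of capturing an execution environment as a declarative description and deriving a reproducible specification (here a Dockerfile) from it. The schema, field names, and generator below are invented for this sketch and do not come from the thesis.

```python
# Hypothetical environment description: what resources a workflow needs
# and how they should be arranged. All names here are placeholders.
environment = {
    "base_os": "ubuntu:20.04",
    "packages": ["python3", "python3-pip", "graphviz"],
    "pip_packages": ["numpy==1.24.0", "networkx"],
    "workflow_entrypoint": "python3 run_workflow.py",
}

def to_dockerfile(env: dict) -> str:
    """Derive a reproducible execution-environment spec from the description."""
    lines = [
        f"FROM {env['base_os']}",
        "RUN apt-get update && apt-get install -y " + " ".join(env["packages"]),
        "RUN pip3 install " + " ".join(env["pip_packages"]),
        "COPY . /workflow",
        "WORKDIR /workflow",
        "CMD " + env["workflow_entrypoint"],
    ]
    return "\n".join(lines)

print(to_dockerfile(environment))
```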

Relevance:

40.00%

Publisher:

Abstract:

The conductance across an atomically narrow metallic contact can be measured by using scanning tunneling microscopy. In certain situations, a jump in the conductance is observed right at the point of contact between the tip and the surface, which is known as “jump to contact” (JC). Such behavior provides a way to explore, at a fundamental level, how bonding between metallic atoms occurs dynamically. This phenomenon depends not only on the type of metal but also on the geometry of the two electrodes. For example, while some authors always find JC when approaching two atomically sharp tips of Cu, others find that a smooth transition occurs when approaching a Cu tip to an adatom on a flat surface of Cu. In an attempt to show that all these results are consistent, we make use of atomistic simulations; in particular, classical molecular dynamics together with density functional theory transport calculations to explore a number of possible scenarios. Simulations are performed for two different materials: Cu and Au in a [100] crystal orientation and at a temperature of 4.2 K. These simulations allow us to study the contribution of short- and long-range interactions to the process of bonding between metallic atoms, as well as to compare directly with experimental measurements of conductance, giving a plausible explanation for the different experimental observations. Moreover, we show a correlation between the cohesive energy of the metal, its Young's modulus, and the frequency of occurrence of a jump to contact.
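As a rough illustration of why a jump can occur at all (not the paper's MD/DFT machinery), a common textbook criterion is that the tip apex snaps forward when the gradient of the attractive tip-sample interaction exceeds the effective stiffness holding the apex atom in place. The sketch below uses a Lennard-Jones pair interaction and an assumed tip stiffness; every parameter value is a placeholder.

```python
import numpy as np

# Placeholder Lennard-Jones parameters for the apex-atom/surface interaction
# and an assumed effective tip stiffness; not values from the paper.
epsilon = 0.4    # eV, well depth
sigma = 2.3      # angstrom
k_tip = 0.3      # eV / angstrom^2, effective spring holding the apex atom

z = np.linspace(2.0, 6.0, 2000)   # tip-surface separation, angstrom
# F = -dU/dz for U = 4*eps*((sigma/z)**12 - (sigma/z)**6); F < 0 is attractive
force = 24 * epsilon * (2 * (sigma / z)**13 - (sigma / z)**7) / sigma
force_gradient = np.gradient(force, z)

# Mechanical instability (jump to contact) where the curvature of the
# interaction potential overwhelms the tip stiffness: dF/dz > k_tip.
unstable = force_gradient > k_tip
if unstable.any():
    print(f"jump to contact expected near z ~ {z[unstable].max():.2f} angstrom")
else:
    print("no jump to contact for this stiffness: smooth approach")
```

For a stiff tip-surface combination the unstable band disappears, which is consistent with the abstract's point that whether a jump is seen depends on the geometry (and hence stiffness) of the two electrodes.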

Relevance:

40.00%

Publisher:

Abstract:

Energy harvesting devices are widely discussed as an alternative power source for today's active implantable medical devices. Repeated battery replacement procedures can be avoided by extending the implant's life span, which is the goal of energy harvesting concepts. This reduces the risk of complications for the patient and may even reduce device size. The continuous and powerful contractions of the human heart ideally qualify as a battery substitute. In particular, devices in close proximity to the heart, such as pacemakers, defibrillators or biosignal (ECG) recorders, would benefit from this alternative energy source. The clockwork of an automatic wristwatch was used to transform the heart's kinetic energy into electrical energy. In order to qualify as a continuous energy supply for the consuming device, the mechanism needs to demonstrate its harvesting capability under various conditions. Several in-vivo recorded heart motions were used as input to a mathematical model to optimize the clockwork's original conversion efficiency with respect to myocardial contractions. The resulting design was implemented and tested in in-vitro and in-vivo experiments, which demonstrated the superior sensitivity of the new design for all tested heart motions.
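A rough sketch of the kind of mathematical model the abstract alludes to: the oscillating weight of an automatic watch idealized as a damped eccentric pendulum whose pivot is driven by the heart-wall acceleration, with the harvested (winding) load represented by the rotor damping. The equation, the sinusoidal drive standing in for a recorded heart motion, and all parameter values are illustrative assumptions, not the authors' model.

```python
import numpy as np

# Idealized eccentric-rotor harvester: damped pendulum of inertia J_rot and
# unbalance m_r, driven by an assumed heart-wall acceleration a(t).
J_rot = 5e-7   # kg m^2, rotor inertia (placeholder)
m_r = 5e-5     # kg m, mass x eccentricity of the oscillating weight (placeholder)
c = 2e-7       # N m s, damping representing the harvested load (placeholder)
g = 9.81       # m/s^2

dt = 1e-4
t = np.arange(0.0, 10.0, dt)
a_heart = 15.0 * np.sin(2 * np.pi * 1.2 * t)   # placeholder 72-bpm drive, m/s^2

theta, omega = 0.0, 0.0
harvested = 0.0
for a in a_heart:
    # torque from gravity plus the inertial force of the accelerating pivot,
    # minus the damping torque of the load (semi-implicit Euler step)
    torque = -m_r * (g * np.sin(theta) + a * np.cos(theta)) - c * omega
    omega += dt * torque / J_rot
    theta += dt * omega
    harvested += c * omega**2 * dt               # energy dissipated in the load

print(f"mean harvested power ~ {1e6 * harvested / t[-1]:.1f} microwatts")
```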