922 results for methods and measurement
Abstract:
This article analyzes the role that has been attributed to grammar throughout the history of foreign language teaching, with special emphasis on the methods and approaches of the twentieth century. In order to support our argument, we discuss the notion of grammar by proposing a conceptual continuum that includes the main meanings of the term relevant to our research. We also address the issue of "pedagogical grammar" and consider the position of grammar in the different approaches of the "era of methods" and the current "post-method condition" in the field of language teaching and learning. The findings presented at the end of the text consist in recognizing the central role that grammar has played throughout the history of methods and approaches, where it has always been present through the definition of the progression of contents. The rationale we propose for this is the recognition that the dissociation between what is said and how it is said cannot be more than theoretical and, thus, artificial.
Abstract:
This study aimed to verify the influence of transport in open or closed compartments (0 h), followed by two resting periods (1 and 3 h) before slaughter, on cortisol levels as an indicator of stress. At the slaughterhouse, blood samples were taken from 86 lambs after transport and before slaughter for plasma cortisol analysis. The method of transport influenced the cortisol concentration (0 h; P < 0.01): animals transported in the closed compartment had a lower level (28.97 ng ml(-1)) than animals transported in the open compartment (35.49 ng ml(-1)). After the resting period in the slaughterhouse, there was a decline in plasma cortisol concentration, with the animals subjected to 3 h of rest presenting a lower average cortisol value (24.14 ng ml(-1); P < 0.05) than animals subjected to 1 h of rest (29.95 ng ml(-1)). It can be inferred that lambs that waited 3 h before slaughter had more time to recover from the stress of transportation than those that waited just 1 h. Visual access to the external environment during transport is a stressful factor that changes plasma cortisol levels, and the resting period before slaughter was effective in lowering stress, reducing plasma cortisol in the lambs. (c) 2012 Elsevier B.V. All rights reserved.
Abstract:
Membrane proteins are a large and important class of proteins. They are responsible for several of the key functions in a living cell, e.g. transport of nutrients and ions, cell-cell signaling, and cell-cell adhesion. Despite their importance, it has not been possible to study their structure and organization in much detail because of the difficulty of obtaining 3D structures. In this thesis, theoretical studies of membrane protein sequences and structures have been carried out by analyzing existing experimental data. The data come from several sources, including sequence databases, genome sequencing projects, and 3D structures. Prediction of the membrane-spanning regions by hydrophobicity analysis is a key technique used in several of the studies; a novel method for this is also presented and compared to other methods. The primary questions addressed in the thesis are: What properties are common to all membrane proteins? What is the overall architecture of a membrane protein? What properties govern the integration into the membrane? How many membrane proteins are there and how are they distributed in different organisms? Several of the findings have now been backed up by experiments. An analysis of the large family of G-protein coupled receptors pinpoints differences in length and amino acid composition of loops between proteins with and without a signal peptide, and also differences between extra- and intracellular loops. Known 3D structures of membrane proteins have been studied in terms of hydrophobicity, distribution of secondary structure and amino acid types, position-specific residue variability, and differences between loops and membrane-spanning regions. An analysis of several fully and partially sequenced genomes from eukaryotes, prokaryotes, and archaea has been carried out.
Several differences in the membrane protein content between organisms were found, the most important being the total number of membrane proteins and the distribution of membrane proteins with a given number of transmembrane segments. Of the properties that were found to be similar in all organisms, the most obvious is the bias in the distribution of positive charges between the extra- and intracellular loops. Finally, an analysis of homologues to membrane proteins with known topology uncovered two related, multi-spanning proteins with opposite predicted orientations. The predicted topologies were verified experimentally, providing a first example of "divergent topology evolution".
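The hydrophobicity analysis central to these studies can be illustrated as a sliding-window average over a hydropathy scale. The sketch below is a generic textbook version using the Kyte-Doolittle scale; the window size (19) and threshold (1.6) are common illustrative choices, not the novel prediction method presented in the thesis.

```python
# Sliding-window hydropathy analysis for locating candidate
# membrane-spanning segments (generic sketch, Kyte-Doolittle scale).
KD = {'A': 1.8, 'R': -4.5, 'N': -3.5, 'D': -3.5, 'C': 2.5,
      'Q': -3.5, 'E': -3.5, 'G': -0.4, 'H': -3.2, 'I': 4.5,
      'L': 3.8, 'K': -3.9, 'M': 1.9, 'F': 2.8, 'P': -1.6,
      'S': -0.8, 'T': -0.7, 'W': -0.9, 'Y': -1.3, 'V': 4.2}

def hydropathy_profile(seq, window=19):
    """Mean hydropathy over a sliding window, one value per window centre."""
    half = window // 2
    scores = [KD[aa] for aa in seq]
    return [sum(scores[i - half:i + half + 1]) / window
            for i in range(half, len(seq) - half)]

def candidate_tm_centres(seq, window=19, threshold=1.6):
    """Sequence positions whose window-averaged hydropathy exceeds threshold."""
    half = window // 2
    return [j + half for j, v in enumerate(hydropathy_profile(seq, window))
            if v > threshold]
```

A run of ~20 strongly hydrophobic residues flanked by charged loops will stand out as a contiguous block of above-threshold window centres, which is the basic signal used to call a transmembrane segment.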
Abstract:
The age and growth of the sand sole Pegusa lascaris from the Canarian Archipelago were studied from 2107 fish collected between January 2005 and December 2007. To find an appropriate method for age determination, sagittal otoliths were examined both by surface reading and in frontal section, and the results were compared. The two methods did not differ significantly in estimated age, but the surface-reading method is superior in terms of cost and time efficiency. The sand sole has a moderate life span, with ages up to 10 years recorded. Individuals grow quickly in their first two years, attaining approximately 48% of their maximum standard length; after the second year, their growth rate drops rapidly as energy is diverted to reproduction. Males and females show dimorphism in growth, with females reaching a slightly greater length and age than males. Von Bertalanffy, seasonalized von Bertalanffy, Gompertz, and Schnute growth models were fitted to length-at-age data. Akaike weights for the seasonalized von Bertalanffy growth model indicated that the probability of choosing the correct model from the group of models used was >0.999 for males and females. The seasonalized von Bertalanffy growth parameters estimated were: L∞ = 309 mm standard length, k = 0.166 yr⁻¹, t0 = −1.88 yr, C = 0.347, and ts = 0.578 for males; and L∞ = 318 mm standard length, k = 0.164 yr⁻¹, t0 = −1.653 yr, C = 0.820, and ts = 0.691 for females. Fish standard length and otolith radius are closely correlated (R² = 0.902); their relation is described by a power function (a = 85.11, v = 0.906).
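The seasonalized growth curve reported above is conventionally written in the Somers-type parameterization of the von Bertalanffy model; assuming that form (an assumption, since the abstract does not spell out the equation), the fitted female parameters can be evaluated directly:

```python
import numpy as np

def seasonal_vbgf(t, Linf, k, t0, C, ts):
    """Somers-type seasonalized von Bertalanffy growth function:
    L(t) = Linf * (1 - exp(-k*(t - t0) - S(t) + S(t0)))."""
    S = lambda x: (C * k / (2 * np.pi)) * np.sin(2 * np.pi * (x - ts))
    return Linf * (1.0 - np.exp(-k * (t - t0) - S(t) + S(t0)))

# Female parameters reported in the abstract (lengths in mm SL, ages in yr)
females = dict(Linf=318.0, k=0.164, t0=-1.653, C=0.820, ts=0.691)
ages = np.arange(0.5, 10.5, 0.5)
predicted_sl = seasonal_vbgf(ages, **females)
```

Fitting such a curve to length-at-age data can be done with, e.g., scipy.optimize.curve_fit, and competing models compared via Akaike weights as in the study.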
Abstract:
The popularity of herbal products, especially plant food supplements (PFS) and herbal medicines, is on the rise in Europe and other parts of the world, with increased use in the general population as well as among specific subgroups encompassing children, women, or those suffering from diseases such as cancer. The aim of this paper is to examine the PFS market structures in European Community (EC) Member States and to examine issues concerning methodologies and consumption data relating to PFS use in Europe. A review of recent reports on market data, trends, and main distribution channels, together with an example of the consumption of PFS in Spain, is presented. An overview of the methods and administration techniques used...
Abstract:
The purpose of this work is to define and calculate a factor of collapse related to the traditional method of designing sheet pile walls. Furthermore, we tried to find the parameters that most influence a finite element model representative of this problem. The text is structured as follows: chapters 1 to 5 analyze a series of topics useful for understanding the problem, while the considerations mainly related to the purpose of the text are reported in chapters 6 to 10. The first part of the document presents: what a sheet pile wall is, which codes must be followed for the design of these structures and what they prescribe, how a mathematical model of the soil can be formulated, some fundamentals of finite element analysis, and, finally, the traditional methods that support the design of sheet pile walls. In chapter 6 we performed a parametric analysis, answering the second part of the purpose of the work. Comparing the results of a laboratory test on a cantilever sheet pile wall in sandy soil with those provided by a finite element model of the same problem, we concluded that: in modelling a sandy soil, attention should be paid to the value of cohesion inserted in the model (some programs, like Abaqus, do not accept a null value for this parameter); the friction angle and elastic modulus of the soil significantly influence the behaviour of the structure-soil system; other parameters, like the dilatancy angle or Poisson's ratio, do not seem to influence it. The logical path followed in the second part of the text is reported here. We analyzed two different structures: the first supports an excavation of 4 m, the second an excavation of 7 m.
Both structures are first designed using the traditional method; they are then implemented in a finite element program (Abaqus) and pushed to collapse by decreasing the friction angle of the soil. The factor of collapse is the ratio between the tangents of the initial friction angle and of the friction angle at collapse. Finally, we performed a more detailed analysis of the first structure, observing that the value of the factor of collapse is influenced by a wide range of parameters, including the values of the coefficients assumed in the traditional method and the relative stiffness of the structure-soil system. In the majority of cases, we found that the value of the factor of collapse lies between 1.25 and 2. With some considerations, reported in the text, the values found so far can be compared with the value of the safety factor proposed by the code (linked to the friction angle of the soil).
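The factor of collapse defined above, the ratio between the tangents of the initial friction angle and of the friction angle at collapse, is straightforward to compute; the angles in the example below are hypothetical, chosen only to land inside the 1.25-2 range reported in the text.

```python
import math

def collapse_factor(phi_initial_deg, phi_collapse_deg):
    """Ratio between the tangents of the initial friction angle and of the
    friction angle at which the FE model reaches collapse."""
    return (math.tan(math.radians(phi_initial_deg))
            / math.tan(math.radians(phi_collapse_deg)))

# Hypothetical illustration: a sand with an initial friction angle of
# 35 degrees, pushed to collapse at 22 degrees
factor = collapse_factor(35.0, 22.0)
```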
Abstract:
Some fundamental biological processes such as embryonic development have been preserved during evolution and are common to species at different phylogenetic positions, yet remain largely unknown. The understanding of the cell morphodynamics leading to organized spatial distributions of cells, such as tissues and organs, can be achieved through the reconstruction of cell shapes and positions during the development of a live animal embryo. In this work we design a chain of image processing methods to automatically segment and track cell nuclei and membranes during the development of a zebrafish embryo, which has been widely validated as a model organism for understanding vertebrate development, gene function, and healing and repair mechanisms in vertebrates. The embryo is first labeled through the ubiquitous expression of fluorescent proteins addressed to cell nuclei and membranes, and temporal sequences of volumetric images are acquired with laser scanning microscopy. Cell positions are detected by processing nuclei images either through the generalized form of the Hough transform or by identifying nuclei positions as local maxima after a smoothing preprocessing step. Membrane and nuclei shapes are reconstructed using PDE-based variational techniques such as the Subjective Surfaces and the Chan-Vese methods. Cell tracking is performed by combining the information previously detected on cell shapes and positions with biological regularization constraints. Our results are manually validated and reconstruct the formation of the zebrafish brain at the 7-8 somite stage, with all the cells tracked starting from the late sphere stage with less than 2% error for at least 6 hours. Our reconstruction opens the way to a systematic investigation of cellular behaviors, of the clonal origin and clonal complexity of brain organs, and of the contribution of cell proliferation modes and cell movements to the formation of local patterns and morphogenetic fields.
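The smoothing-plus-local-maxima detection strategy mentioned above can be sketched with standard tools. The filter sizes and the intensity threshold below are illustrative choices, not the values used in the work, and the real pipeline operates on large fluorescence volumes rather than toy arrays.

```python
import numpy as np
from scipy import ndimage

def detect_nuclei(volume, sigma=2.0, min_frac=0.5):
    """Approximate nucleus centres as local maxima of a Gaussian-smoothed
    volume. `sigma` and `min_frac` (intensity threshold as a fraction of
    the global maximum) are illustrative parameters."""
    smoothed = ndimage.gaussian_filter(np.asarray(volume, dtype=float), sigma)
    # A voxel is a local maximum if it equals the max of its neighbourhood
    is_max = smoothed == ndimage.maximum_filter(smoothed, size=5)
    # Discard flat background plateaus and dim maxima
    is_max &= smoothed > min_frac * smoothed.max()
    return np.argwhere(is_max)
```

In the actual pipeline, detections like these seed the variational surface reconstruction and the tracking stage.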
Abstract:
Perfluoroalkylated substances are a group of chemicals that have been widely employed during the last 60 years in several applications, spreading and accumulating in the environment due to their extreme resistance to degradation. As a consequence, they have been found in various types of food as well as in drinking water, proving that they can easily reach humans through the diet. The available information concerning their adverse effects on health has recently increased interest in these contaminants and highlighted the importance of investigating all potential sources of human exposure, among which diet was proved to be the most relevant. This need has been underlined by the European Union through Recommendation 2010/161/EU, in which Member States were called on to monitor the presence of these substances in food, producing accurate estimations of human exposure. The purpose of the research presented in this thesis, the result of a partnership between an Italian and a French laboratory, was to develop reliable tools for the analysis of these pollutants in food, to be used for generating data on potentially contaminated matrices. An efficient method based on liquid chromatography-mass spectrometry for the detection of 16 different perfluorinated compounds in milk has been validated in accordance with current European regulation guidelines (2002/657/EC) and is currently under evaluation for ISO 17025 accreditation. The proposed technique was applied to cow, powdered, and human breast milk samples from Italy and France to produce a preliminary survey of the presence of these contaminants. In accordance with the above-mentioned European Recommendation, this project also led to the development of a promising technique for the quantification of some precursors of these substances in fish. This method showed highly satisfactory performance in terms of linearity and limits of detection, and will be useful for future surveys.
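As an illustration of how detection limits are commonly estimated from a linear calibration curve, the generic ICH-style sketch below derives LOD and LOQ from regression residuals. Note that Decision 2002/657/EC, cited above, defines its own performance criteria (decision limit CCα and detection capability CCβ), so this is a textbook illustration, not the validation scheme of the thesis itself; the calibration data in the test are invented.

```python
import numpy as np

def lod_loq(conc, response):
    """Detection and quantification limits from a linear calibration:
    LOD = 3.3*sigma/slope, LOQ = 10*sigma/slope, where sigma is the
    residual standard deviation of the regression (generic ICH-style
    estimate)."""
    conc = np.asarray(conc, dtype=float)
    response = np.asarray(response, dtype=float)
    slope, intercept = np.polyfit(conc, response, 1)
    # ddof=2: two parameters (slope, intercept) were fitted
    sigma = (response - (slope * conc + intercept)).std(ddof=2)
    return 3.3 * sigma / slope, 10.0 * sigma / slope
```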
Abstract:
The phenomenon of diffuse scattering has been the subject of numerous studies in recent years, thanks to its relevance in electromagnetic propagation as well as in many other fields of application (remote sensing, optics, physics, etc.), yet a complete understanding of this effect is far from being reached. The complexity of studying and characterizing diffuse scattering stems from the myriad of cases and effects that can be encountered in a real propagation environment, suggesting the need to treat its contribution probabilistically. Hence the need for applications that are efficient from an engineering point of view and that combine a rigorous definition of the phenomenon with the simplification required for practical purposes. In this view, diffuse scattering can be described as the superposition of all those effects that deviate from the classical laws of geometrical optics (reflection, refraction, and diffraction), generating field contributions even at points in space and in directions where, in theory, for smooth and homogeneous objects, there should be none. The main effect, in a real propagation environment, is therefore a spatial distribution of the field different from the theoretical case of a smooth, homogeneous surface, together with depolarization effects and a redistribution of energy in the power balance. The complexity of the phenomenon is thus evident, and the goal of this work is to propose new results that allow a better description of diffuse scattering and to identify the topics on which to focus attention in future work.
A bibliographic study was first carried out to identify existing models and theories and to single out the points deserving further reflection; at the same time, methodologies for characterizing the complex electric permittivity of materials were analyzed, in order to evaluate the possibility of deriving the parameters to be used in simulations with the same measurement setup devised for the study of diffuse scattering. A simulation setup was then built with electromagnetic computation software (based on the finite-difference time-domain method), which made it possible to analyze the three-dimensional scattering caused by irregularities of the material. Finally, a measurement campaign was conducted in an anechoic chamber with an ad-hoc experimental bench, in order to characterize the scattering phenomenon over a wide band.
Abstract:
This thesis studies molecular dynamics simulations on two levels of resolution: the detailed level of atomistic simulations, where the motion of explicit atoms in a many-particle system is considered, and the coarse-grained level, where the motion of superatoms composed of up to 10 atoms is modeled. While atomistic models are capable of describing material specific effects on small scales, the time and length scales they can cover are limited due to their computational costs. Polymer systems are typically characterized by effects on a broad range of length and time scales. Therefore it is often impossible to atomistically simulate processes, which determine macroscopic properties in polymer systems. Coarse-grained (CG) simulations extend the range of accessible time and length scales by three to four orders of magnitude. However, no standardized coarse-graining procedure has been established yet. Following the ideas of structure-based coarse-graining, a coarse-grained model for polystyrene is presented. Structure-based methods parameterize CG models to reproduce static properties of atomistic melts such as radial distribution functions between superatoms or other probability distributions for coarse-grained degrees of freedom. Two enhancements of the coarse-graining methodology are suggested. Correlations between local degrees of freedom are implicitly taken into account by additional potentials acting between neighboring superatoms in the polymer chain. This improves the reproduction of local chain conformations and allows the study of different tacticities of polystyrene. It also gives better control of the chain stiffness, which agrees perfectly with the atomistic model, and leads to a reproduction of experimental results for overall chain dimensions, such as the characteristic ratio, for all different tacticities. The second new aspect is the computationally cheap development of nonbonded CG potentials based on the sampling of pairs of oligomers in vacuum. 
Static properties of polymer melts are obtained as predictions of the CG model, in contrast to other structure-based CG models, which are iteratively refined to reproduce reference melt structures. The dynamics of simulations at the two levels of resolution are compared. The time scales of dynamical processes in atomistic and coarse-grained simulations can be connected by a time scaling factor, which depends on several specific system properties such as molecular weight, density, temperature, and other components in mixtures. In this thesis, the influence of molecular weight in systems of oligomers and the situation in two-component mixtures are studied. For a system of small additives in a melt of long polymer chains, the temperature dependence of the additive diffusion is predicted and compared to experiments.
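One common way to estimate the time scaling factor between coarse-grained and atomistic dynamics is to match mean-squared displacements in the diffusive regime, where MSD = 6*D*t. The sketch below assumes that approach, which may differ in detail from the mapping actually used in the thesis:

```python
import numpy as np

def time_scaling_factor(t, msd_atomistic, msd_cg):
    """CG-to-atomistic time scaling factor estimated from the diffusive
    regime: the ratio of apparent diffusion coefficients gives the factor
    by which CG time must be stretched to match atomistic dynamics."""
    D_at = np.polyfit(t, msd_atomistic, 1)[0] / 6.0
    D_cg = np.polyfit(t, msd_cg, 1)[0] / 6.0
    return D_cg / D_at  # > 1 means CG dynamics run faster
```

Because the soft CG potentials smooth the energy landscape, this factor is typically greater than one, and, as noted above, it varies with molecular weight, density, temperature, and composition.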
Abstract:
The goal of the present research is to define a Semantic Web framework for precedent modelling, using knowledge extracted from text, metadata, and rules, while maintaining a strong text-to-knowledge morphism between legal texts and legal concepts, in order to fill the gap between a legal document and its semantics. The framework is composed of four different models that make use of standard languages from the Semantic Web stack of technologies: a document metadata structure, modelling the main parts of a judgement and creating a bridge between a text and its semantic annotations of legal concepts; a legal core ontology, modelling abstract legal concepts and institutions contained in a rule of law; a legal domain ontology, modelling the main legal concepts in a specific domain concerned by case-law; and an argumentation system, modelling the structure of argumentation. The input to the framework includes metadata associated with judicial concepts and an ontology library representing the structure of case-law. The research relies on the previous efforts of the community in the field of legal knowledge representation and rule interchange for applications in the legal domain, in order to apply the theory to a set of real legal documents, stressing OWL axiom definitions as much as possible so that they provide a semantically powerful representation of the legal document and a solid ground for an argumentation system using a defeasible subset of predicate logic. It appears that some new features of OWL 2 unlock useful reasoning features for legal knowledge, especially if combined with defeasible rules and argumentation schemes. The main task is thus to formalize the legal concepts and argumentation patterns contained in a judgement, with the following requirement: to check, validate, and reuse the discourse of a judge - and the argumentation he produces - as expressed by the judicial text.
Abstract:
Decomposition-based approaches are recalled from the primal and dual points of view. The possibility of building partially disaggregated reduced master problems is investigated. This extends the idea of aggregated versus disaggregated formulations to a gradual choice among alternative levels of aggregation. Partial aggregation is applied to the linear multicommodity minimum cost flow problem. The possibility of having only partially aggregated bundles opens a wide range of alternatives with different trade-offs between the number of iterations and the computation required for solving it. This trade-off is explored on several sets of instances, and the results are compared with those obtained by directly solving the natural node-arc formulation. An iterative solution process for the route assignment problem is proposed, based on the well-known Frank-Wolfe algorithm. In order to provide a first feasible solution to the Frank-Wolfe algorithm, a linear multicommodity min-cost flow problem is solved to optimality using the decomposition techniques mentioned above. Solutions of this problem are useful for network orientation and design, especially in relation to public transportation systems such as Personal Rapid Transit. A single-commodity robust network design problem is then addressed: an undirected graph with edge costs is given together with a discrete set of balance matrices representing different supply/demand scenarios, and the goal is to determine the minimum cost installation of capacities on the edges such that the flow exchange is feasible in every scenario. A set of new instances that are computationally hard for the natural flow formulation are solved by means of a new heuristic algorithm. Finally, an efficient decomposition-based heuristic approach for a large-scale stochastic unit commitment problem is presented.
The addressed real-world stochastic problem employs at its core a deterministic unit commitment planning model developed by the California Independent System Operator (ISO).
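The Frank-Wolfe iteration mentioned above alternates a linearized subproblem with a step toward its solution. The toy sketch below illustrates the scheme on the unit simplex, where the linear subproblem is solved by picking a vertex; in the route assignment application the same role is played by an all-or-nothing shortest-path assignment over the flow polytope, which this sketch does not implement.

```python
import numpy as np

def frank_wolfe_simplex(grad, x0, steps=2000):
    """Frank-Wolfe on the unit simplex: at each iteration the linearized
    problem min_s <grad(x), s> is solved exactly (its minimizer is a
    vertex of the simplex), and the iterate moves toward it with the
    classical 2/(k+2) step size."""
    x = np.asarray(x0, dtype=float)
    for k in range(steps):
        g = grad(x)
        s = np.zeros_like(x)
        s[np.argmin(g)] = 1.0          # linear minimization oracle
        x += (2.0 / (k + 2.0)) * (s - x)
    return x

# Toy problem: minimize 0.5*||x - c||^2 over the simplex; since c lies
# on the simplex, the optimum is c itself
c = np.array([0.2, 0.3, 0.5])
x_star = frank_wolfe_simplex(lambda x: x - c, np.array([1.0, 0.0, 0.0]))
```

Each iterate remains a convex combination of vertices, so feasibility is maintained throughout, which is precisely why the method suits flow polytopes.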
Abstract:
The Standard Model of elementary particle physics was developed to describe the fundamental particles which constitute matter and the interactions between them. The Large Hadron Collider (LHC) at CERN in Geneva was built to solve some of the remaining open questions in the Standard Model and to explore physics beyond it, by colliding two proton beams at world-record centre-of-mass energies. The ATLAS experiment is designed to reconstruct particles and their decay products originating from these collisions. The precise reconstruction of particle trajectories plays an important role in the identification of particle jets which originate from bottom quarks (b-tagging). This thesis describes the step-wise commissioning of the ATLAS track reconstruction and b-tagging software and one of the first measurements of the b-jet production cross section in pp collisions at sqrt(s)=7 TeV with the ATLAS detector. The performance of the track reconstruction software was studied in great detail, first using data from cosmic ray showers and then collisions at sqrt(s)=900 GeV and 7 TeV. The good understanding of the track reconstruction software allowed a very early deployment of the b-tagging algorithms. First studies of these algorithms and the measurement of the b-tagging efficiency in the data are presented. They agree well with predictions from Monte Carlo simulations. The b-jet production cross section was measured with the 2010 dataset recorded by the ATLAS detector, employing muons in jets to estimate the fraction of b-jets. The measurement is in good agreement with the Standard Model predictions.
Abstract:
Small molecules affecting biological processes in plants are widely used in agricultural practice as herbicides or plant growth regulators, and in basic plant sciences as probes to study the physiology of plants. Most of these compounds were identified in large screens by the agrochemical industry or as phytoactive natural products; more recently, novel phytoactive compounds have originated from academic research through chemical screens performed to induce specific phenotypes of interest.
The aim of the present PhD thesis is to evaluate different approaches used for the identification of the primary mode of action (MoA) of a phytoactive compound. Based on the methodologies used for MoA identification, three approaches are discerned: a phenotyping approach, an approach based on a genetic screen, and a biochemical screening approach.
Four scientific publications resulting from my work are presented as examples of how a phenotyping approach can successfully be applied to describe the plant MoA of different compounds in detail.
I. A subgroup of cyanoacrylates has been discovered as plant growth inhibitors. A set of bioassays indicated a specific effect on cell division. Cytological investigations of the cell division process in plant cell cultures, studies of microtubule assembly with green fluorescent protein marker lines in vivo, and cross-resistance studies with Eleusine indica plants harbouring a mutation in alpha-tubulin led to the description of alpha-tubulin as a target site of cyanoacrylates (Tresch et al., 2005).
II. The MoA of the herbicide flamprop-m-methyl was not known so far. The studies described in Tresch et al. (2008) indicate a primary effect on cell division. Detailed studies unravelled a specific effect on mitotic microtubule figures, causing a block in cell division. In contrast to other inhibitors of microtubule rearrangement such as dinitroanilines, flamprop-m-methyl did not influence microtubule assembly in vitro.
An influence of flamprop-m-methyl on a target within the cytoskeleton signalling network could therefore be proposed (Tresch et al., 2008).
III. The herbicide endothall is a protein phosphatase inhibitor structurally related to the natural product cantharidin. Bioassay studies indicated a dominant effect on dark-grown cells that was unrelated to the effects observed in the light. Cytological characterisation of the microtubule cytoskeleton in corn tissue and heterotrophic tobacco cells showed a specific effect of endothall on mitotic spindle formation and on the ultrastructure of the nucleus, in combination with a decrease of the proliferation index. The observed effects are similar to those of other protein phosphatase inhibitors such as cantharidin and the structurally different okadaic acid. Additionally, the observed effects show similarities to knock-out lines of the TON1 pathway, a protein phosphatase-regulated signalling pathway. The data presented in Tresch et al. (2011) associate endothall's known in vitro inhibition of protein phosphatases with in vivo effects and suggest an interaction between endothall and the TON1 pathway.
IV. Mefluidide, as a plant growth regulator, induces growth retardation and a specific phenotype indicating an inhibition of fatty acid biosynthesis. A test of cuticle functionality suggested a defect in the biosynthesis of very-long-chain fatty acids (VLCFA) or waxes. Metabolic profiling studies showed similarities with different groups of VLCFA synthesis inhibitors. Detailed analyses of VLCFA composition in tissues of duckweed (Lemna paucicostata) indicated a specific inhibition of the known herbicide target 3-ketoacyl-CoA synthase (KCS). Inhibitor studies using a yeast expression system established for plant KCS proteins verified the potency of mefluidide as an inhibitor of plant KCS enzymes. It could be shown that the strength of inhibition varied between KCS homologues.
The Arabidopsis Cer6 protein, which induces a plant growth phenotype similar to mefluidide when knocked out, was one of the most sensitive KCS enzymes (Tresch et al., 2012).
The findings of my own work were combined with other publications reporting a successful identification of the MoA and primary target proteins of different compounds or compound classes. A revised three-tier approach for the MoA identification of phytoactive compounds is proposed. The approach consists of a 1st level aiming to address compound stability, uniformity of effects in different species, general cytotoxicity, and the effect on common processes like transcription and translation. Based on these findings, advanced studies can be defined to start the 2nd level of MoA characterisation, either with further phenotypic characterisation, starting a genetic screen, or establishing a biochemical screen. At the 3rd level, enzyme assays or protein affinity studies should show the activity of the compound on the hypothesized target and should associate the in vitro effects with the in vivo profile of the compound.
Towards the 3D attenuation imaging of active volcanoes: methods and tests on real and simulated data
Abstract:
The purpose of my PhD thesis has been to address the issue of retrieving a three-dimensional attenuation model in volcanic areas. To this purpose, I first elaborated a robust strategy for the analysis of seismic data, performing several synthetic tests to assess the applicability of the spectral ratio method to our purposes. The results of the tests allowed us to conclude that: 1) the spectral ratio method gives reliable differential attenuation (dt*) measurements in smooth velocity models; 2) a short signal time window has to be chosen to perform the spectral analysis; 3) the frequency range over which spectral ratios are computed greatly affects dt* measurements. Furthermore, a refined approach for the application of the spectral ratio method has been developed and tested; through this procedure, the effects caused by heterogeneities of the propagation medium on the seismic signals may be removed. The tested data analysis technique was applied to the real active seismic SERAPIS database, providing a dataset of dt* measurements which was used to obtain a three-dimensional attenuation model of the shallowest part of the Campi Flegrei caldera. Then, a linearized, iterative, damped attenuation tomography technique was tested and applied to the selected dataset. The tomography, with a resolution of 0.5 km in the horizontal directions and 0.25 km in the vertical direction, allowed us to image important features in the offshore part of the Campi Flegrei caldera. High Qp bodies are immersed in a high-attenuation body (Qp = 30), which is well correlated with low Vp and high Vp/Vs values and is interpreted as a layer of saturated marine and volcanic sediments. High Qp anomalies, instead, are interpreted as the effects either of cooled lava bodies or of a CO2 reservoir. A pseudo-circular high Qp anomaly was detected and interpreted as the buried rim of the NYT caldera.
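The spectral ratio method at the core of this strategy exploits the fact that, for two phases sharing the same source spectrum, the log ratio of their amplitude spectra is linear in frequency with slope −π·dt*. A minimal sketch of that fit follows (windowing, instrument correction, and band selection, which the thesis shows to be critical, are omitted; the synthetic spectra are invented for illustration):

```python
import numpy as np

def differential_tstar(freqs, spec_a, spec_b):
    """dt* from the spectral ratio method: ln(A_a/A_b) is linear in
    frequency with slope -pi*dt*, so a straight line fitted to the log
    spectral ratio yields the differential attenuation."""
    slope = np.polyfit(freqs, np.log(spec_a / spec_b), 1)[0]
    return -slope / np.pi

# Synthetic check: two spectra sharing a source term, t* = 0.05 and 0.03 s
f = np.linspace(1.0, 20.0, 50)
source = f * np.exp(-f / 10.0)        # common source spectrum, cancels in ratio
spec_a = source * np.exp(-np.pi * f * 0.05)
spec_b = source * np.exp(-np.pi * f * 0.03)
```

The sensitivity of the recovered dt* to the chosen frequency band, noted in conclusion 3) above, shows up in practice as a dependence of the fitted slope on which part of the spectrum is trusted.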