963 results for Palaeomagnetism Applied to Geologic Processes


Relevance: 100.00%

Publisher:

Abstract:

This article reviews the concept of Lamarckian inheritance and the use of the term epigenetics in the field of animal genetics. Epigenetics was first coined by Conrad Hal Waddington (1905–1975), who derived the term from the Aristotelian word epigenesis. Some controversy surrounds the word epigenetics and its broad definition, which includes any modification of gene expression due to factors other than mutation of the DNA sequence. This encompasses DNA methylation and post-translational modification of histones, but also the regulation of gene expression by non-coding RNAs, genome instabilities, or any other force that could modify a phenotype. There is little evidence for transgenerational epigenetic inheritance in mammals, which may commonly be confounded with environmental forces acting simultaneously on an individual, her developing fetus, and the germ cell line of that fetus, although it could have an important role in the energetic status of cells. Finally, we review some of the scarce literature on the use of epigenetics in animal breeding programs.

Relevance: 100.00%

Publisher:

Abstract:

Attention Deficit-Hyperactivity Disorder is a disorder that affects 3 to 5 percent of children globally. Many of them live in areas with very few or no medical professionals qualified to help them. To help address this problem, a system was developed that allows physicians to follow their patients' progress and prescribe treatments. These treatments can be drugs or behavioral exercises. The behavioral exercises were designed in the form of games in order to motivate the patients, who are children, to engage in the treatment. The system allows the patients to play the prescribed games under the supervision of their tutors. Each game is designed to improve the patient's handling of the disorder by training a specific mental component. The objective of this approach is to complement the traditional form of treatment by allowing a physician to prescribe therapeutic games and keep patients under supervision between their regular consultations. The main goal of this project is to provide patients with better control of their symptoms than with traditional therapy alone. Experimental field tests with children and clinical staff offer promising results. This research was developed in the context of a funded project involving INESC C (Polytechnic Institute of Leiria delegation), the Santo André Hospital of Leiria, and the start-up company PlusrootOne (which owns the project).
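
The abstract describes a prescription-and-supervision workflow (physicians prescribe games, tutors supervise sessions, progress is tracked between consultations). The following is a minimal, hypothetical data-model sketch of that workflow; the class and field names are illustrative assumptions, not taken from the actual system.

```python
from dataclasses import dataclass, field
from datetime import date, datetime
from typing import List

@dataclass
class GamePrescription:
    """A therapeutic game prescribed by a physician to a patient."""
    game_id: str            # which training game (e.g. attention, memory)
    sessions_per_week: int  # prescribed frequency
    start: date
    end: date

@dataclass
class GameSession:
    """One supervised play session recorded for follow-up."""
    game_id: str
    played_at: datetime
    score: float            # in-game performance metric
    supervised_by: str      # tutor who supervised the session

@dataclass
class PatientRecord:
    """Links prescriptions and sessions so the physician can track progress."""
    patient_id: str
    prescriptions: List[GamePrescription] = field(default_factory=list)
    sessions: List[GameSession] = field(default_factory=list)

    def adherence(self, prescription: GamePrescription) -> float:
        """Fraction of prescribed sessions actually played in the period."""
        weeks = max((prescription.end - prescription.start).days / 7, 1)
        expected = prescription.sessions_per_week * weeks
        played = sum(
            1 for s in self.sessions
            if s.game_id == prescription.game_id
            and prescription.start <= s.played_at.date() <= prescription.end
        )
        return min(played / expected, 1.0)
```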

Relevance: 100.00%

Publisher:

Abstract:

The application of molecular methods offers a faster alternative to traditional methods based on morphology. It is nearly impossible to process all the samples within a short period using traditional methods, and marine sediments deteriorate rapidly. dT-RFLP (directed Terminal-Restriction Fragment Length Polymorphism) allows a rapid assessment of biodiversity changes in nematode assemblages. The use of an unsuitable fixative, storage time, or DNA extraction method can be a limitation in molecular analyses such as dT-RFLP and real-time PCR. Objectives: to determine the best fixative, the level of DNA degradation over time, and the best DNA extraction method for marine nematodes that is suitable for dT-RFLP analysis.
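
In T-RFLP, each taxon is characterized by the length of the labelled terminal fragment produced by a restriction digest of its amplicon. The sketch below is a minimal in-silico illustration of that idea; the amplicon sequences are hypothetical placeholders, and the exact cut offset within the recognition site is ignored for simplicity.

```python
# Minimal in-silico T-RFLP sketch: the terminal restriction fragment (TRF) of
# an amplicon is measured from the labelled 5' end to the first restriction
# site. The amplicon sequences below are hypothetical placeholders, and the
# exact cut offset within the recognition site is ignored for simplicity.

def terminal_fragment_length(sequence: str, recognition_site: str) -> int:
    """Approximate 5'-terminal fragment length after an in-silico digest.

    If the recognition site is absent, the whole (uncut) amplicon length is
    returned, as happens in a real digest.
    """
    cut = sequence.upper().find(recognition_site.upper())
    return len(sequence) if cut == -1 else cut

# Hypothetical amplicons from two nematode taxa, digested with HhaI (GCGC).
amplicons = {
    "taxon_A": "ATGCGTACGTAGGCGCATTACGGATCC",
    "taxon_B": "ATGCTTACGTACGATTACGGCGCATCC",
}
profile = {taxon: terminal_fragment_length(seq, "GCGC")
           for taxon, seq in amplicons.items()}
print(profile)  # distinct TRF lengths separate the two taxa in the profile
```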

Relevance: 100.00%

Publisher:

Abstract:

Carbonate sedimentation in the Ossa-Morena Zone during the Palaeozoic comprises at least two main episodes. However, some chronological questions remain open, due to the lack of biostratigraphic data in some carbonates. Sr isotope analysis was performed on selected limestones and marbles of the Ossa-Morena Zone in order to discriminate the Sr signature of the two main carbonate sedimentation episodes. The Sr isotopic data from the analyzed carbonates show two clusters of 87Sr/86Sr ratios, one related to the Lower Cambrian and the other to the Lower-Middle Devonian carbonates.
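
The abstract reports that the measured 87Sr/86Sr ratios fall into two clusters. As a purely illustrative aside, the sketch below shows one simple way such a two-group split can be computed from a list of ratios; the numerical values are hypothetical placeholders, not the measured data, and assigning each cluster to an episode would still rely on the seawater Sr isotope curve.

```python
# Illustrative 1-D two-cluster split of 87Sr/86Sr ratios. The values below are
# hypothetical placeholders, not the measured data from the study.

def two_cluster_split(values):
    """Split sorted 1-D values at the largest gap into two clusters."""
    ordered = sorted(values)
    gaps = [ordered[i + 1] - ordered[i] for i in range(len(ordered) - 1)]
    cut = gaps.index(max(gaps)) + 1
    return ordered[:cut], ordered[cut:]

ratios = [0.70790, 0.70795, 0.70801, 0.70880, 0.70886, 0.70892]  # hypothetical
cluster_low, cluster_high = two_cluster_split(ratios)
print("cluster 1:", cluster_low)   # would be interpreted against one episode
print("cluster 2:", cluster_high)  # and the other, via the seawater Sr curve
```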

Relevance: 100.00%

Publisher:

Abstract:

Catalytic Arylation Methods – From the Academic Lab to Industrial Processes

Relevance: 100.00%

Publisher:

Abstract:

This paper analyzes the concept of constructive paranoia stated by journalist and author Andrés Oppenheimer to promote development in Latin America. Based on that concept, this paper discusses the effectiveness of current English Language Teaching in particular, as well as what should be done in order to obtain better results. In conclusion, a restructuring of the approach, curriculum, and methodology used in teaching the language is proposed.

Relevance: 100.00%

Publisher:

Abstract:

The Simple Algorithm for Evapotranspiration Retrieving (SAFER) was used to estimate biophysical parameters and the energy balance components in two different experimental pasture areas in São Paulo state, Brazil. The experimental pastures consist of six rotational grazing system (RGS) and three continuous grazing system (CGS) paddocks. Landsat-8 images from the 2013 and 2015 dry and rainy seasons were used, as these years presented similar hydrological cycles, with 1,600 mm and 1,613 mm of annual precipitation, resulting in 19 cloud-free images. Bands 1 to 7 and thermal bands 10 and 11 were used together with weather data from a station located near the experimental area. Temporal values of NDVI, biomass, evapotranspiration, and latent heat flux (λE) statistically distinguish the CGS from the RGS areas. The grazing system influences the energy partition, and these results indicate that RGS benefits biomass production, evapotranspiration, and the microclimate, due to higher λE values. SAFER is a feasible tool to estimate biophysical parameters and energy balance components in pasture and has the potential to discriminate continuous from rotational grazing systems in a temporal analysis.
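
For reference, NDVI is the standard normalized difference of near-infrared and red reflectance, and SAFER models the ratio of actual to reference evapotranspiration as an exponential function of surface temperature divided by the product of albedo and NDVI. The sketch below assumes that general form; the coefficients, the temperature units, and all input values are placeholders that would have to match the calibration actually used, not the study's values.

```python
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """Normalized Difference Vegetation Index from surface reflectances."""
    return (nir - red) / (nir + red)

def safer_et_fraction(surface_temp: np.ndarray,
                      albedo: np.ndarray,
                      ndvi_values: np.ndarray,
                      a: float = 1.8, b: float = -0.008) -> np.ndarray:
    """SAFER-style ratio of actual to reference evapotranspiration (ET/ET0).

    The exponential regression on Ts / (albedo * NDVI) follows the general
    form of the SAFER model; the coefficients a and b are site-calibrated,
    and the defaults here (with Ts in degrees Celsius) are placeholders.
    """
    return np.exp(a + b * surface_temp / (albedo * ndvi_values))

# Hypothetical per-pixel inputs for a small pasture scene.
red = np.array([0.10, 0.08])          # red reflectance [-]
nir = np.array([0.35, 0.40])          # near-infrared reflectance [-]
ts = np.array([29.5, 27.0])           # land surface temperature [degC]
alb = np.array([0.18, 0.17])          # surface albedo [-]
et0 = 5.2                             # reference ET from the weather station [mm/day]

et_actual = safer_et_fraction(ts, alb, ndvi(nir, red)) * et0
print(et_actual)                      # actual ET per pixel [mm/day]
```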

Relevance: 100.00%

Publisher:

Abstract:

The application of Computational Fluid Dynamics based on the Reynolds-Averaged Navier-Stokes equations to the simulation of bluff body aerodynamics has been thoroughly investigated in the past. Although a satisfactory accuracy can be obtained for some urban physics problems, the predictive capability of these models is limited to the mean flow properties, while the ability to accurately predict turbulent fluctuations is recognized to be of fundamental importance when dealing with wind loading and pollution dispersion problems. The need to correctly take into account the flow dynamics when such problems are faced has led researchers to move towards scale-resolving turbulence models such as Large Eddy Simulation (LES). The development and assessment of LES as a tool for the analysis of these problems is nowadays an active research field and represents a demanding engineering challenge. This research work has two objectives. The first is focused on wind load assessment and aims to study the capability of LES to reproduce wind load effects in terms of internal forces on structural members. This differs from the majority of the existing research, where the performance of LES is evaluated only in terms of surface pressures, and is done with a view to adopting LES as a complementary design tool alongside wind tunnel tests. The second objective is the study of the capability of LES to calculate pollutant dispersion in the built environment. The validation of LES in this field is considered to be of the utmost importance in order to conceive healthier and more sustainable cities. In order to validate the adopted numerical setup, a systematic comparison between numerical and experimental data is performed. The obtained results are intended to be used in the drafting of best practice guidelines for the application of LES in the urban physics field, with particular attention to wind load assessment and pollution dispersion problems.
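
Since the work evaluates LES beyond mean surface pressures, a minimal post-processing sketch may help fix ideas: from a pressure time series at a tap, the mean pressure coefficient is what a RANS model can deliver, while the RMS of its fluctuation is the kind of quantity that motivates scale-resolving simulation. The reference conditions and the synthetic signal below are illustrative assumptions.

```python
import numpy as np

def pressure_coefficients(p: np.ndarray, p_ref: float,
                          rho: float, u_ref: float):
    """Mean and RMS (fluctuating) pressure coefficients from a time series.

    Cp(t) = (p(t) - p_ref) / (0.5 * rho * u_ref**2); the mean value is what a
    RANS model can provide, while the RMS of the fluctuation is the kind of
    quantity that requires a scale-resolving simulation such as LES.
    """
    q = 0.5 * rho * u_ref ** 2          # reference dynamic pressure
    cp = (p - p_ref) / q
    return cp.mean(), cp.std(ddof=0)

# Hypothetical pressure-tap time series at one point on a building surface.
rng = np.random.default_rng(0)
p_series = 15.0 + 8.0 * rng.standard_normal(10_000)   # [Pa], illustrative
cp_mean, cp_rms = pressure_coefficients(p_series, p_ref=0.0,
                                        rho=1.225, u_ref=10.0)
print(cp_mean, cp_rms)
```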

Relevance: 100.00%

Publisher:

Abstract:

The design optimization of industrial products has always been an essential activity to improve product quality while reducing time-to-market and production costs. Although cost management is very complex and comprises all phases of the product life cycle, the control of geometrical and dimensional variations, known as Dimensional Management (DM), allows compliance with product and process requirements. Hence, tolerance-cost optimization becomes the main practice for an effective application of Design for Tolerancing (DfT) and Design to Cost (DtC) approaches, by enabling a connection between product tolerances and the associated manufacturing costs. However, despite the growing interest in this topic, the profitable application of these techniques in industry is hampered by their complexity: the definition of a systematic framework is the key element for improving design optimization and enhancing the concurrent use of Computer-Aided tools and Model-Based Definition (MBD) practices. The present doctoral research aims to define and develop an integrated methodology for product/process design optimization, to better exploit the new capabilities of advanced simulations and tools. By implementing predictive models and multi-disciplinary optimization, a Computer-Aided integrated framework for tolerance-cost optimization has been proposed to allow the integration of DfT and DtC approaches and their direct application to the design of automotive components. Several case studies have been considered, with the final application of the integrated framework to a high-performance V12 engine assembly, achieving both functional targets and cost reduction. From a scientific point of view, the proposed methodology provides an improvement to the tolerance-cost optimization of industrial components. The integration of theoretical approaches and Computer-Aided tools makes it possible to analyse the influence of tolerances on both product performance and manufacturing costs. The case studies proved the suitability of the methodology for application in the industrial field and identified further areas for improvement and refinement.
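
As a hedged illustration of the tolerance-cost trade-off this framework builds on (not the thesis methodology itself), the sketch below minimizes a classic reciprocal tolerance-cost model subject to a root-sum-square assembly stack-up requirement; all coefficients and limits are hypothetical.

```python
import numpy as np
from scipy.optimize import minimize

# Textbook-style tolerance-cost optimization sketch (not the thesis framework):
# each component cost grows as the tolerance tightens, C_i(t_i) = A_i + B_i/t_i,
# and the tolerances must satisfy an RSS (root-sum-square) assembly requirement.
# All coefficients and limits below are hypothetical.

A = np.array([2.0, 1.5, 3.0])      # fixed cost per component [EUR]
B = np.array([0.8, 0.5, 1.2])      # cost sensitivity to tolerance [EUR*mm]
T_ASSEMBLY = 0.30                  # allowed assembly variation [mm]

def total_cost(t):
    return float(np.sum(A + B / t))

def rss_margin(t):
    # >= 0 when the RSS stack-up meets the assembly requirement
    return T_ASSEMBLY - np.sqrt(np.sum(t ** 2))

result = minimize(
    total_cost,
    x0=np.full(3, 0.1),                               # initial tolerances [mm]
    bounds=[(0.01, 0.25)] * 3,                        # manufacturable range
    constraints=[{"type": "ineq", "fun": rss_margin}],
)
print(result.x, total_cost(result.x))  # optimal tolerances and minimum cost
```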

Relevance: 100.00%

Publisher:

Abstract:

Nowadays, cities deal with unprecedented pollution and overpopulation problems, and Internet of Things (IoT) technologies are supporting them in facing these issues and becoming increasingly smart. IoT sensors embedded in public infrastructure can provide granular data on the urban environment and help public authorities make their cities more sustainable and efficient. Nonetheless, this pervasive data collection also raises serious surveillance risks, jeopardizing privacy and data protection rights. Against this backdrop, this thesis addresses how IoT surveillance technologies can be implemented in a legally compliant and ethically acceptable fashion in smart cities. An interdisciplinary approach is embraced to investigate this question, combining doctrinal legal research (on privacy, data protection, and criminal procedure) with insights from philosophy, governance, and urban studies. The fundamental normative argument of this work is that surveillance constitutes a necessary feature of modern information societies. Nonetheless, as the complexity of surveillance phenomena increases, there emerges a need to develop more finely attuned proportionality assessments to ensure a legitimate implementation of monitoring technologies. This research tackles this gap from different perspectives, analyzing the EU data protection legislation and the United States and European case law on privacy expectations and surveillance. Specifically, a coherent multi-factor test assessing privacy expectations in public IoT environments and a surveillance taxonomy are proposed to inform proportionality assessments of surveillance initiatives in smart cities. These insights are also applied to four use cases: facial recognition technologies, drones, environmental policing, and smart nudging. Lastly, the investigation examines competing data governance models in the digital domain and the smart city, reviewing the upcoming EU data governance framework. It is argued that, despite the stated policy goals, the balance of interests may often favor corporate strategies in data sharing, to the detriment of common-good uses of data in the urban context.

Relevance: 100.00%

Publisher:

Abstract:

The study of ancient, undeciphered scripts presents unique challenges that depend both on the nature of the problem and on the peculiarities of each writing system. In this thesis, I present two computational approaches that are tailored to two different tasks and writing systems. The first of these methods is aimed at the decipherment of the Linear A fraction signs, in order to discover their numerical values. This is achieved with a combination of constraint programming, ad-hoc metrics, and paleographic considerations. The second main contribution of this thesis regards the creation of an unsupervised deep learning model which uses drawings of signs from ancient writing systems to learn to distinguish different graphemes in the vector space. This system, which is based on techniques used in the field of computer vision, is adapted to the study of ancient writing systems by incorporating information about sign sequences in the model, mirroring what is often done in natural language processing. In order to develop this model, the Cypriot Greek Syllabary is used as a target, since it is a deciphered writing system. Finally, this unsupervised model is adapted to the undeciphered Cypro-Minoan script and used to answer open questions about it. In particular, by reconstructing multiple allographs that are not agreed upon by paleographers, it supports the idea that Cypro-Minoan is a single script and not a collection of three scripts, as has been proposed in the literature. These results on two different tasks show that computational methods can be applied to undeciphered scripts, despite the relatively small amount of available data, paving the way for further advances in paleography using these methods.
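
As a heavily simplified illustration of the constraint-programming component (the real method also relies on ad-hoc metrics and paleographic considerations, and the data here are not the Linear A corpus), the sketch below enumerates assignments of candidate unit-fraction values to hypothetical signs under a few hypothetical document-derived constraints.

```python
from fractions import Fraction
from itertools import permutations

# Illustration of the constraint-search idea behind assigning numerical values
# to fraction signs: candidate unit fractions are assigned to signs so that
# attested combinations never exceed a whole unit and observed orderings are
# respected. Signs, candidate values and "attested" constraints below are
# hypothetical, not the Linear A data.

signs = ["J", "E", "F"]
candidates = [Fraction(1, 2), Fraction(1, 4), Fraction(1, 8), Fraction(1, 3)]

def satisfies(assign):
    return (assign["J"] + assign["E"] <= 1   # J and E co-occur within one unit
            and 3 * assign["F"] <= 1         # F is attested repeated three times
            and assign["J"] > assign["E"])   # J is attested as larger than E

solutions = [
    dict(zip(signs, values))
    for values in permutations(candidates, len(signs))
    if satisfies(dict(zip(signs, values)))
]
for sol in solutions:
    print(sol)   # each surviving assignment is a candidate value set
```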

Relevance: 100.00%

Publisher:

Abstract:

Negative stiffness structures are mechanical systems that require a decrease in the applied force to generate an increase in displacement. They are structures that possess special characteristics such as snap-through and bi-stability. All of these features make them particularly suitable for applications such as shock absorption, vibration isolation, and damping. These characteristics have attracted growing attention, and numerical simulation is of great interest for matching such structures to the intended application. In this regard, this thesis is a continuation of previous studies on a circular negative stiffness structure and aims to refine its numerical model by presenting a new solution. To that end, an investigation procedure is needed. Among the available methods, root cause analysis was chosen to perform the investigation, since it provides a clear view of the problem under analysis and a categorization of all the causes behind it. The cause-effect analysis yielded the main causes that influence the numerical results. Once all of the causes were listed, solutions were proposed, leading to a new numerical model. The proposed numerical model uses a nonlinear analysis with hexagonal elements and a hyperelastic material model. The results were analyzed through force-displacement curves, allowing the visualization of the structure's energy recovery. When compared with the results obtained in the experimental part, the trend is similar and the negative stiffness behaviour is present.
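
As an analytical aside (this is the classic two-bar von Mises truss, not the circular structure studied in the thesis, and the geometry and stiffness values are hypothetical), the sketch below produces the kind of force-displacement curve in which a negative-stiffness branch connects the two stable configurations.

```python
import numpy as np

# Classic two-bar (von Mises) truss: an analytical analog that exhibits the
# snap-through and negative-stiffness branch discussed in the abstract. This
# is not the circular structure studied in the thesis; geometry and stiffness
# values are hypothetical.

def von_mises_truss_force(d, a=50.0, h=10.0, k=2.0):
    """Vertical force [N] needed to push the apex down by d [mm].

    a : half-span of the supports [mm]
    h : initial apex height [mm]
    k : axial stiffness of each bar [N/mm]
    """
    L0 = np.sqrt(a ** 2 + h ** 2)            # undeformed bar length
    L = np.sqrt(a ** 2 + (h - d) ** 2)       # current bar length
    return 2.0 * k * (L0 - L) * (h - d) / L  # dU/dd of the stored elastic energy

d = np.linspace(0.0, 20.0, 201)              # push the apex through both stable states
F = von_mises_truss_force(d)
stiffness = np.gradient(F, d)                # negative where the branch is unstable
print("negative-stiffness range: "
      f"{d[stiffness < 0][0]:.2f} mm to {d[stiffness < 0][-1]:.2f} mm")
```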

Relevance: 100.00%

Publisher:

Abstract:

Radio Simultaneous Localization and Mapping (SLAM) consists of simultaneously tracking a target and estimating the surrounding environment, in order to build a map and estimate the target's movements within it. It is an increasingly exploited technique for automotive applications, to improve the localization of obstacles and of the target's movement relative to them; for emergency situations, for example when it is necessary to explore (with a drone or a robot) environments with limited visibility; and for personal radar applications, thanks to its versatility and low cost. Until now, these systems have been based on light detection and ranging (lidar) or visual cameras: high-accuracy but expensive approaches that are limited to specific environments and weather conditions. Radar-based systems, by contrast, can operate in exactly the same way in smoke, fog, or simple darkness. In this thesis activity, the Fourier-Mellin algorithm is analyzed and implemented to verify its applicability to Radio SLAM, in which the radar frames can be treated as images and the radar motion between consecutive frames can be recovered through registration. Furthermore, a simplified version of the algorithm is proposed, in order to solve the problems of the Fourier-Mellin algorithm when working with real radar images and to improve performance. The INRAS RBK2, a MIMO 2x16 mmWave radar, is used for the experimental acquisitions, consisting of multiple tests performed in Lab-E of the Cesena Campus, University of Bologna. The performance of Fourier-Mellin and of its simplified version is also compared with that of the MatchScan algorithm, a classic algorithm for SLAM systems.
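
A minimal sketch of standard Fourier-Mellin-style registration (not the thesis implementation or its simplified variant) is shown below, using scikit-image and SciPy: rotation is estimated by phase-correlating the log-polar transforms of the two magnitude spectra, and the residual translation by ordinary phase correlation; the test image merely stands in for a radar frame.

```python
import numpy as np
from scipy.ndimage import rotate, shift
from skimage import data
from skimage.registration import phase_cross_correlation
from skimage.transform import warp_polar

# Minimal Fourier-Mellin-style registration sketch (not the thesis code or its
# simplified variant): rotation is estimated by phase-correlating the log-polar
# transforms of the two magnitude spectra, the moving frame is de-rotated, and
# the residual translation is estimated by ordinary phase correlation.

def register_frames(frame_a, frame_b):
    """Estimate rotation [deg] and translation [px] between two frames."""
    # Magnitude spectra are invariant to translation, so rotation appears as a
    # shift along the angular axis of their (log-)polar representation.
    spec_a = np.abs(np.fft.fftshift(np.fft.fft2(frame_a)))
    spec_b = np.abs(np.fft.fftshift(np.fft.fft2(frame_b)))
    radius = min(frame_a.shape) // 2
    polar_a = warp_polar(spec_a, radius=radius, scaling="log")
    polar_b = warp_polar(spec_b, radius=radius, scaling="log")
    shifts, _, _ = phase_cross_correlation(polar_a, polar_b)
    angle = shifts[0] * 360.0 / polar_a.shape[0]   # angular axis spans 360 deg
    derotated = rotate(frame_b, angle, reshape=False)
    translation, _, _ = phase_cross_correlation(frame_a, derotated)
    return angle, translation

# Synthetic check on a placeholder image standing in for a radar frame:
# frame_b is frame_a rotated and shifted, and the estimates should recover the
# motion (up to the sign conventions of rotate/warp_polar).
frame_a = data.camera().astype(float)
frame_b = shift(rotate(frame_a, 12.0, reshape=False), (5.0, -3.0))
print(register_frames(frame_a, frame_b))
```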

Relevance: 100.00%

Publisher:

Abstract:

In the field of power electronics, several types of motor control systems have been developed using STM microcontrollers and power boards. Power electronic inverters are widely used in both industrial power applications and domestic appliances. Inverters are used to control the torque, speed, and position of the rotor in AC motor drives, and an inverter delivers constant-voltage, constant-frequency power in uninterruptible power sources. Because conventional inverter power supplies have high power consumption and a low transfer efficiency, a three-phase sine-wave AC power supply was created using the STM32 embedded system, which has low power consumption and adequate speed. It can deliver an output frequency of 50 Hz at the rated RMS line voltage. An STM32-based inverter integrates and optimizes the power electronics application, combining the required hardware, software, and application-level solutions, including power architectures, techniques, and tools capable of performing on real devices and equipment. Power inverters are currently implemented in green energy power systems together with low-power components such as sensors or microcontrollers to operate motors and pumps. An STM-based power inverter is efficient, low-cost, and reliable. My thesis work was based on STM motor drives and control systems, which can be implemented in a gas analyser for operating its pumps and motors. Such systems have been widely applied in various engineering sectors due to their ability to respond to adverse structural changes and their improved structural reliability. The present research was designed to use an STM inverter board with a low-power MCU such as a NUCLEO board, starting with practical examples such as blinking an LED and generating PWM. A three-phase inverter model was then implemented with the Steval-IPM08B board, which converts a single-phase 230 V AC input into a three-phase 380 V AC output; this output can be used to operate an induction motor.
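
To make the three-phase generation concrete, the sketch below computes sinusoidal PWM duty cycles for three references spaced 120 degrees apart at a 50 Hz output; on the actual STM32 firmware these values would be written to the timer compare registers each PWM period, and the carrier frequency and modulation index shown here are hypothetical placeholders.

```python
import numpy as np

# Sketch of three-phase sinusoidal PWM (SPWM) reference generation for a 50 Hz
# output. On the actual STM32 firmware these duty cycles would be written to
# the compare registers of an advanced-control timer every PWM period; here
# the computation is only illustrated offline. Carrier frequency and
# modulation index are hypothetical placeholders.

F_OUT = 50.0          # desired output frequency [Hz]
F_PWM = 16_000.0      # PWM carrier frequency [Hz]
M = 0.9               # modulation index (0..1)

def three_phase_duties(sample_index: int) -> tuple:
    """Duty cycles (0..1) for phases U, V, W at one PWM period."""
    theta = 2.0 * np.pi * F_OUT * sample_index / F_PWM
    offsets = np.array([0.0, -2.0 * np.pi / 3.0, -4.0 * np.pi / 3.0])
    # Sine references centred at 50% duty so the output swings symmetrically.
    duties = 0.5 + 0.5 * M * np.sin(theta + offsets)
    return tuple(duties)

# One full 50 Hz period spans F_PWM / F_OUT = 320 PWM periods.
table = np.array([three_phase_duties(n) for n in range(int(F_PWM / F_OUT))])
print(table[:3])                 # first few duty triplets
print(table.min(), table.max())  # stays within [0.05, 0.95] for M = 0.9
```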

Relevance: 100.00%

Publisher:

Abstract:

Usually, data warehouse populating processes are data-oriented workflows composed of dozens of granular tasks that are responsible for the integration of data coming from different data sources. Specific subsets of these tasks, together with their relationships, can be grouped into collections in order to form higher-level constructs. Increasing task granularity allows for the generalization of processes, simplifying their views and providing methods to carry expertise over to new applications. Well-proven practices can be used to describe general solutions that use basic skeletons configured and instantiated according to a set of specific integration requirements. Patterns can be applied to ETL processes aiming not only to simplify a possible conceptual representation but also to reduce the gap that often exists between the two design perspectives. In this paper, we demonstrate the feasibility and effectiveness of an ETL pattern-based approach using task clustering, analyzing a real-world ETL scenario through the definition of two commonly used clusters of tasks: a data lookup cluster and a data conciliation and integration cluster.
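
As a minimal sketch of how one such cluster could be instantiated (the paper defines its clusters at the conceptual/ETL-tool level, and the table, key, and field names below are hypothetical), the following shows a parameterizable "data lookup" skeleton configured for a specific integration requirement.

```python
from typing import Callable, Dict, Iterable, Iterator, Optional

# Minimal sketch of a reusable "data lookup" ETL pattern: a skeleton that is
# configured (lookup table, join key, output field, default) and instantiated
# for a concrete integration requirement. Field and table names are
# hypothetical; the paper defines such clusters at the conceptual level.

Record = Dict[str, object]

def make_lookup_step(lookup_rows: Iterable[Record],
                     key_field: str,
                     value_field: str,
                     output_field: str,
                     default: Optional[object] = None
                     ) -> Callable[[Iterable[Record]], Iterator[Record]]:
    """Build a configured lookup step that enriches a stream of records."""
    index = {row[key_field]: row[value_field] for row in lookup_rows}

    def step(records: Iterable[Record]) -> Iterator[Record]:
        for record in records:
            enriched = dict(record)
            enriched[output_field] = index.get(record[key_field], default)
            yield enriched

    return step

# Instantiation for one hypothetical requirement: resolve surrogate keys for
# a customer dimension while loading a sales fact stream.
customer_dim = [{"customer_nk": "C-01", "customer_sk": 101},
                {"customer_nk": "C-02", "customer_sk": 102}]
lookup_customer = make_lookup_step(customer_dim, "customer_nk",
                                   "customer_sk", "customer_sk", default=-1)

sales = [{"customer_nk": "C-01", "amount": 40.0},
         {"customer_nk": "C-99", "amount": 12.5}]   # C-99 is a late-arriving key
print(list(lookup_customer(sales)))
```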