885 results for Application of Radar Technologies in Hydrology
Abstract:
The development of high-throughput techniques ('chip' technology) for measuring gene expression and gene polymorphisms (genomics), together with techniques for measuring global protein expression (proteomics) and metabolite profiles (metabolomics), is revolutionising life science research, including research in human nutrition. In particular, the ability to undertake large-scale genotyping and to identify gene polymorphisms that determine the risk of chronic disease (candidate genes) could enable an individual's risk to be defined at an early age. However, the search for candidate genes has proven more complex, and their identification more elusive, than previously thought. This is largely because much of the variability in risk results from interactions between the genome and environmental exposures. Whilst the former is now very well defined through the Human Genome Project, the latter (e.g. diet, toxins, physical activity) are poorly characterised, resulting in an inability to account for their confounding effects in most large-scale candidate gene studies. The polygenic nature of most chronic diseases adds further complexity, requiring very large studies to disentangle the relatively weak effects of large numbers of potential 'risk' genes. The efficacy of diet as a preventive strategy could also be considerably increased by better information on the gene polymorphisms that determine variability in responsiveness to specific diet and nutrient changes. Much of the limited available data is based on retrospective genotyping using stored samples from previously conducted intervention trials. Prospective studies are now needed to provide data that can serve as the basis for individualised dietary advice and for the development of food products that optimise disease prevention. Application of the new technologies in nutrition research offers considerable potential for new knowledge and could greatly advance the role of diet as a preventive strategy in the 21st century. Given the potential economic and social benefits, funding for research in this area needs greater recognition, and a stronger strategic focus, than is presently the case. The application of genomics to human health poses considerable ethical and societal as well as scientific challenges. Economic determinants of health care provision are more likely to resolve such issues than scientific developments or altruistic concerns for human health.
Abstract:
In this paper, the global market potential of solar thermal, photovoltaic (PV) and combined photovoltaic/thermal (PV/T) technologies, at present and in the near future, is discussed. The concept of PV/T and the theory behind PV/T operation are briefly introduced, and standards for evaluating the technical, economic and environmental performance of PV/T systems are addressed. A comprehensive literature review of R&D work and practical applications of PV/T technology is presented, and the review results are critically analysed in terms of PV/T type and research methodology. The major features, current status, research focuses and existing difficulties/barriers of the various types of PV/T are identified. The research methods applied to PV/T technology, including theoretical analysis and computer simulation, experimental and combined experimental/theoretical investigation, demonstration and feasibility studies, and economic and environmental analyses, are individually discussed, and the achievements and remaining problems in each research-method category are described. Finally, opportunities for further PV/T research are identified. The review indicated that air- and water-based PV/T systems are the most commonly used technologies, but their heat-removal effectiveness is relatively low. Refrigerant- and heat-pipe-based PV/T systems, although still at the research/laboratory stage, could achieve much higher solar conversion efficiencies than air- and water-based systems; however, they face several technical challenges in practice that require further resolution. The review suggested that further work could be undertaken to (1) develop new feasible, economical and energy-efficient PV/T systems; (2) optimise the structural/geometrical configurations of existing PV/T systems; (3) study the long-term dynamic performance of PV/T systems; (4) demonstrate PV/T systems in real buildings and conduct feasibility studies; and (5) carry out advanced economic and environmental analyses. This review helps identify the questions remaining in PV/T technology and new research topics/directions to further improve PV/T performance, remove the barriers to practical PV/T application, establish standards/regulations for PV/T design and installation, and promote its market penetration throughout the world.
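As a concrete illustration of the performance-evaluation standards mentioned above, the sketch below computes the usual first-law performance figures of a PV/T collector (thermal, electrical and combined efficiency). It is a minimal sketch assuming steady-state operation; all parameter values are illustrative assumptions, not data from the review.

```python
# Minimal sketch of first-law performance metrics for a flat-plate PV/T
# collector under steady-state conditions. All numbers are illustrative
# assumptions, not values from the review.

def pvt_efficiencies(m_dot, cp, t_out, t_in, p_el, irradiance, area):
    """Return (thermal, electrical, combined) efficiencies."""
    q_thermal = m_dot * cp * (t_out - t_in)   # useful heat gain, W
    eta_th = q_thermal / (irradiance * area)  # thermal efficiency
    eta_el = p_el / (irradiance * area)       # electrical efficiency
    return eta_th, eta_el, eta_th + eta_el

# Example: water-based PV/T, 0.02 kg/s flow, 8 K temperature rise,
# 120 W electrical output, 1000 W/m^2 irradiance on a 1.5 m^2 panel.
eta_th, eta_el, eta_tot = pvt_efficiencies(
    m_dot=0.02, cp=4186.0, t_out=305.0, t_in=297.0,
    p_el=120.0, irradiance=1000.0, area=1.5)
print(f"thermal {eta_th:.2f}, electrical {eta_el:.2f}, total {eta_tot:.2f}")
```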
Abstract:
This thesis focuses on adapting formal education to people's technology-use patterns, their technologies-in-practice, in which the ubiquitous use of mobile technologies is central. The research question is: How can language learning practices occurring in informal learning environments be effectively integrated with formal education through the use of mobile technology? The study investigates the technical, pedagogical, social and cultural challenges involved, using a design science approach. The thesis consists of four studies. The first systematises research on mobile-assisted language learning (MALL). The second investigates Swedish and Chinese students' attitudes towards the use of mobile technology in education. The third examines students' use of technology in an online language course, with a specific focus on their learning practices in informal learning contexts and their understanding of how this use guides their learning. Based on the findings, a purpose-built MALL application was developed and used in two courses. The fourth study analyses the app's use in terms of students' perceived level of self-regulation and structuration. The studies show that technology itself plays a very important role in reshaping people's attitudes and that new learning methods are co-constructed in a sociotechnical system. Technology's influence on student practices is equally strong across borders. Students' established technologies-in-practice guide the ways they approach learning. Hence, designing effective online distance education involves three interrelated elements: technology, information and social arrangements. This thesis contributes to mobile learning research by offering empirically and theoretically grounded insights that shift the focus from technology design to the design of information systems.
Abstract:
Nowadays there are many information technologies that can make a significant difference in supporting collaborative efforts in the workplace. The role of IT is to support group collaboration by empowering team members with the right capabilities. One way to assess capabilities is through a maturity model. This paper proposes a first version of the Collaboration-Technology Maturity Model (CTMM), intended to serve as a strategic instrument for IT managers to control and manage the adoption of collaboration technologies (CITs) in their organizations. Our contribution is both theoretical and practical: we propose a descriptive maturity model together with an application method and assessment instruments. We also completed an empirical evaluation, conducting 89 assessments at Latin American companies of all sizes and industries. This extensive field exercise allowed us not only to evaluate the usefulness of the model and instruments but also to investigate CIT adoption patterns in Latin America, collecting historical data to further evolve CTMM into a comparative model. The responses were used to draw conclusions on CIT adoption in Latin America with respect to three specific factors: country of origin (region), size (number of employees) and industry type. The implications of our findings for practitioners and researchers are discussed.
Abstract:
Background: New challenges are arising in the animal protein market, and one of the main global challenges is to produce more in less time, with better quality and in a sustainable way. Brazil is the largest beef exporter by volume; hence, the factors affecting the beef chain are of major concern to the country's economy. An emerging class of biotechnological approaches, molecular markers, is bringing new perspectives to these challenges, particularly after the publication of the first complete livestock genome (bovine), which triggered a massive initiative to put into practice the benefits of the so-called Post-Genomic Era. Review: This article aimed to show directions and insights in the application of molecular markers to livestock genetic improvement and reproduction, as well as to organize the progress made so far, pointing out perspectives for these emerging technologies in the context of Brazilian ruminant production. An overview of the main molecular markers explored in ruminant production is provided, describing the molecular bases and detection approaches available for microsatellites (STRs) and single nucleotide polymorphisms (SNPs). A section reviews the history of association studies between markers and variation in economically important livestock traits, following a timeline that begins with quantitative trait locus (QTL) identification using STR markers and ends with high-resolution SNP panels used in whole-genome scans for phenotype/genotype association. The article also organizes this information to show how QTL prospecting with STRs opened the way for marker-assisted selection, and why this approach is quickly being replaced by genome-wide association studies using SNPs in a new framework called genomic selection. Conclusion: The world's scientific community is dedicating effort and resources to applying SNP information to livestock selection through the development of high-density panels for genomic association studies, connecting molecular genetic data with phenotypes of economic interest. Once generated, this information can be used to make decisions in genetic improvement programs by selecting animals with the assistance of molecular markers.
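To make the genomic selection concept concrete, the following sketch estimates SNP effects jointly with a ridge regression in the spirit of RR-BLUP and ranks candidate animals by genomic estimated breeding value (GEBV). The data are simulated and the shrinkage parameter is an arbitrary assumption; the reviewed article does not prescribe this particular implementation.

```python
# Illustrative sketch (not from the article) of the core idea behind
# genomic selection: estimate many small SNP effects jointly, then rank
# candidates by their genomic estimated breeding values (GEBVs).
import numpy as np

rng = np.random.default_rng(42)
n_animals, n_snps = 200, 1000

# SNP genotypes coded 0/1/2 (copies of one allele), column-centred.
X = rng.integers(0, 3, size=(n_animals, n_snps)).astype(float)
X -= X.mean(axis=0)

# Simulate a polygenic trait: many small SNP effects plus noise.
true_effects = rng.normal(0.0, 0.05, size=n_snps)
y = X @ true_effects + rng.normal(0.0, 1.0, size=n_animals)

# Ridge (RR-BLUP-style) estimate: beta = (X'X + lambda*I)^-1 X'y.
lam = 50.0  # shrinkage strength -- an arbitrary assumption here
beta = np.linalg.solve(X.T @ X + lam * np.eye(n_snps), X.T @ y)

# GEBV of each animal = sum of its marker genotypes times the effects.
gebv = X @ beta
print("top-10 candidates by GEBV:", np.argsort(gebv)[::-1][:10])
```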
Abstract:
A significant amount of information stored in databases around the world can be shared through peer-to-peer databases. This yields a large knowledge base without the need for large investments, because existing databases and infrastructure are reused. However, the structural characteristics of peer-to-peer networks make the process of finding such information complex. Moreover, these databases are often heterogeneous in their schemas but semantically similar in their content. A good peer-to-peer database system should allow the user to access information from databases scattered across the network and receive only the information genuinely related to their topic of interest. This paper proposes the use of ontologies in peer-to-peer database queries to represent the semantics inherent in the data. The main contributions of this work are enabling integration between heterogeneous databases, improving the performance of such queries, and applying the Ant Colony Optimization algorithm to the problem of locating information on peer-to-peer networks, which improved results by 18%. © 2011 IEEE.
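The abstract does not detail the paper's Ant Colony Optimization formulation, so the sketch below only illustrates the general idea as it might apply here: pheromone trails bias query routing towards peers that have returned relevant results, with evaporation preventing lock-in. Peer names, heuristic values and parameters are all illustrative assumptions.

```python
# Generic ant-colony-optimisation sketch for routing a query towards
# peers likely to hold relevant data. All values are assumptions.
import random

peers = ["p1", "p2", "p3", "p4"]
# Static desirability of each peer, e.g. semantic match between its
# schema and the query ontology -- assumed scores.
heuristic = {"p1": 0.9, "p2": 0.4, "p3": 0.7, "p4": 0.2}
pheromone = {p: 1.0 for p in peers}

ALPHA, BETA, RHO = 1.0, 2.0, 0.1  # pheromone weight, heuristic weight, evaporation

def choose_peer():
    """Pick a peer with probability proportional to pheromone^a * heuristic^b."""
    weights = [pheromone[p] ** ALPHA * heuristic[p] ** BETA for p in peers]
    return random.choices(peers, weights=weights, k=1)[0]

def reinforce(peer, quality):
    """Evaporate pheromone everywhere, then deposit on the chosen peer."""
    for p in peers:
        pheromone[p] *= (1.0 - RHO)
    pheromone[peer] += quality

for _ in range(100):          # each iteration = one "ant" (query probe)
    p = choose_peer()
    reinforce(p, quality=heuristic[p])  # reward peers returning relevant results

print(max(pheromone, key=pheromone.get))  # peer the colony converged on
```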
Abstract:
In the Nilo Coelho irrigation scheme, Brazil, the natural vegetation has been replaced by irrigated agriculture, making it important to quantify the effects on the energy exchanges between the mixed vegetated surfaces and the lower atmosphere. Landsat satellite images and agro-meteorological station data from 1992 to 2011 were used together to model these exchanges. Surface albedo (α0), NDVI and surface temperature (T0) were the basic remotely sensed parameters needed to calculate the latent heat flux (λE) and the surface resistance to evapotranspiration (rs) at large scale. Daily net radiation (Rn) was obtained from α0, air temperature (Ta) and short-wave transmissivity (τsw) through the Slob equation, allowing quantification of the daily sensible heat flux (H) as the residual of the energy balance equation. With a threshold value for rs, it was possible to separate the energy fluxes of crops from those of natural vegetation. The fractions of Rn partitioned into H and λE averaged 39% and 67%, respectively. The energy used for evapotranspiration inside irrigated areas increased from 51% in 1992 to 80% in 2011, with the ratio λE/Rn rising by 3% per year. The tools and models applied in this research can support the monitoring of coupled climate and land-use change effects in irrigation perimeters, and are valuable for the future sustainability of irrigated agriculture, helping to avoid conflicts among different water users. © 2012 SPIE.
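A minimal sketch of the daily energy-balance bookkeeping described above: net radiation from the Slob equation, with sensible heat flux obtained as the residual. The 110 W/m² coefficient is the commonly cited Slob constant; the example inputs are illustrative, not values from the study, and daily soil heat flux is assumed negligible.

```python
# Per-pixel sketch of the daily energy balance: Rn from the Slob
# equation, H by residual. Inputs below are illustrative assumptions.

A_SLOB = 110.0  # long-wave coefficient of the Slob equation, W/m^2

def daily_net_radiation(albedo, rs_down, tau_sw):
    """Slob equation: Rn = (1 - albedo) * RS_down - a * tau_sw."""
    return (1.0 - albedo) * rs_down - A_SLOB * tau_sw

def sensible_heat_residual(rn, latent_heat):
    """Energy-balance residual: H = Rn - lambdaE (daily G assumed ~ 0)."""
    return rn - latent_heat

# Example pixel: albedo 0.18, daily mean incoming short-wave 250 W/m^2,
# transmissivity 0.65, remotely sensed latent heat flux 100 W/m^2.
rn = daily_net_radiation(albedo=0.18, rs_down=250.0, tau_sw=0.65)
h = sensible_heat_residual(rn, latent_heat=100.0)
print(f"Rn = {rn:.1f} W/m^2, H = {h:.1f} W/m^2, lambdaE/Rn = {100.0/rn:.2f}")
```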
Abstract:
The evaluation of technologies employed in agricultural production systems, such as crop rotation and soil preparation, both associated with crop-livestock integration, is crucial. The aim of the present study was therefore to evaluate lime incorporation under three tillage systems, together with two cultural managements, in a crop-livestock integration system, with emphasis on corn grain yield. The experiment was conducted from January 2003 to April 2005 in Selvíria, Mato Grosso do Sul state, on a clayey Dystroferric Red Latosol. The experimental design was randomized blocks with split plots, consisting of three main treatments aimed at soil physical conditioning and lime incorporation (PD, no-tillage; CM, minimum tillage; and PC, conventional tillage) and two secondary treatments related to cultural management (crop rotation and crop succession), with four replications. The following agronomic traits of maize were analyzed: plant height, stem diameter, height of first ear insertion, 100-grain weight and grain yield. The results showed that maize production under the crop-livestock integration system is quite feasible, with grain yields comparable to regional averages; neither the different soil physical conditioning and lime incorporation treatments nor the cultural managements (rotation and succession) affected corn yield or crop behavior after two years of cultivation.
Abstract:
This PhD thesis set out to validate and then apply innovative analytical methodologies for the determination of compounds with a harmful impact on human health, such as biogenic amines and ochratoxin A, in wines. The influence of production technology (pH, amino acid precursors and the use of different malolactic starters) on the biogenic amine content of wines was evaluated. An HPLC method for the simultaneous determination of amino acids and amines, with pre-column derivatization with 9-fluorenylmethoxycarbonyl chloride (FMOC-Cl) and UV detection, was developed. Initially, the influence of pH, derivatization time and gradient profile was studied. To improve the separation of amino acids and amines and to reduce analysis time, the influence of different flow rates and columns on the chromatographic method was then studied: first a C18 Luna column and later two Chromolith monolithic columns in series. The method proved suitable for the easy, precise and accurate determination of a relatively large number of amino acids and amines in wines. It was then applied to different wines produced in the Emilia-Romagna region. The investigation made it possible to discriminate between red and white wines. Amino acid content is related to the winemaking process. The biogenic amine content of these wines does not represent a toxicological problem for human health. The study of the influence of technology and wine composition demonstrated that wine pH and amino acid content are the most important factors; in particular, wines with pH > 3.5 show higher concentrations of biogenic amines than wines with lower pH. Enrichment of wines with nutrients also influences the content of some biogenic amines, which are higher in wines supplemented with amino acid precursors. In this study, amino acids and biogenic amines were not statistically affected by the strain of lactic acid bacteria inoculated as a starter for malolactic fermentation. Different clean-up methods (SPE-MycoSep, IACs and LLE) and determination methods (HPLC and ELISA) for ochratoxin A were also evaluated. The results showed that the SPE clean-up methods are similarly reliable, while the LLE procedure shows the lowest recovery. The ELISA method gave lower determinations and lower reproducibility than the HPLC method.
Abstract:
Throughout the twentieth century, statistical methods increasingly became part of experimental research. In particular, statistics made quantification processes meaningful in the soft sciences, which had traditionally relied on activities such as collecting and describing diversity rather than taming variation. The thesis explores this change in relation to agriculture and biology, focusing on analysis of variance and experimental design, the statistical methods developed by the mathematician and geneticist Ronald Aylmer Fisher during the 1920s. The role that Fisher's methods acquired as tools of scientific research, side by side with the laboratory equipment and the field practices adopted by research workers, is investigated here bottom-up, beginning with the computing instruments and information technologies that were the tools of the trade for statisticians. Four case studies show, from several perspectives, the interaction of statistics, computing and information technologies, on the one hand giving an overview of the main tools adopted in the period (mechanical calculators, statistical tables, punched and index cards, standardised forms, digital computers), and on the other pointing out how these tools complemented each other and were instrumental in the development and dissemination of analysis of variance and experimental design. The period considered is the half-century from the early 1920s to the late 1960s, the institutions investigated are Rothamsted Experimental Station and the Galton Laboratory, and the statisticians examined are Ronald Fisher and Frank Yates.
Abstract:
There is a demand for technologies able to assess the perfusion of surgical flaps quantitatively and reliably in order to avoid ischemic complications. The aim of this study was to test a new high-speed, high-definition laser Doppler imaging (LDI) system (FluxEXPLORER, Microvascular Imaging, Lausanne, Switzerland) for preoperative mapping of the vascular supply (perforator vessels) and postoperative flow monitoring. In high-definition imaging mode, the FluxEXPLORER maps the perfusion of a 9 x 9 cm area at a resolution of 256 x 256 pixels within 6 s. The sensitivity and predictive value for localizing perforators were expressed as the coincidence of preoperatively assessed LDI high-flow spots with intraoperatively verified perforators in nine patients. Eighteen free flaps were monitored before, during and after total ischemia. 63% of all verified perforators corresponded to a high-flow spot, and 38% of all high-flow spots corresponded to a verified perforator (positive predictive value). All perfused flaps showed values above 221 perfusion units (PU), and all values obtained in ischemic flaps were below 187 PU. In summary, we conclude that the present LDI system can serve as a reliable, fast and easy-to-handle tool to detect ischemia in free flaps, whereas perforator vessels cannot be detected reliably.
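The reported separation between perfused and ischemic flaps suggests a simple monitoring rule, sketched below. The cut-offs (221 and 187 PU) come from the study; the decision function itself, including the indeterminate band, is an illustrative assumption rather than a protocol from the paper.

```python
# Illustrative monitoring rule derived from the reported thresholds:
# readings above 221 PU were always perfused, readings below 187 PU
# always ischemic, leaving an indeterminate band in between.

def classify_flap(mean_pu: float) -> str:
    if mean_pu > 221.0:
        return "perfused"
    if mean_pu < 187.0:
        return "ischemic - alert the surgical team"
    return "indeterminate - repeat measurement"

for reading in (240.0, 200.0, 150.0):
    print(reading, "->", classify_flap(reading))
```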
Abstract:
The examination of traffic accidents is daily routine in forensic medicine. An important question in the analysis of the victims of traffic accidents, for example in collisions between motor vehicles and pedestrians or cyclists, is the impact situation. Apart from forensic medical examination (external examination and autopsy), three-dimensional technologies and methods are gaining importance in forensic investigations. Besides post-mortem multi-slice computed tomography (MSCT) and magnetic resonance imaging (MRI) for the documentation and analysis of internal findings, highly precise 3D surface scanning is employed for the documentation of external body findings and of injury-inflicting instruments. The correlation of the injuries of the body to the injury-inflicting object and the accident mechanism is of great importance. The applied methods include documentation of the external and internal body, of the involved vehicles and of the inflicting tools, as well as analysis of the acquired data. The body surface and the accident vehicles with their damage were digitized by 3D surface scanning. For the internal findings of the body, post-mortem MSCT and MRI were used. The analysis included the processing of the obtained data into 3D models, determination of the driving direction of the vehicle, correlation of injuries to the vehicle damage, geometric determination of the impact situation and evaluation of further findings of the accident. In the following article, the benefits of 3D documentation and computer-assisted, drawn-to-scale 3D comparisons of the relevant injuries with the damage to the vehicle in the analysis of the course of accidents, especially with regard to the impact situation, are shown for two examined cases.
Abstract:
In this project we developed conductive thermoplastic resins by adding varying amounts of three different carbon fillers, carbon black (CB), synthetic graphite (SG) and multi-walled carbon nanotubes (CNT), to a polypropylene matrix for application as fuel cell bipolar plates. This fuel cell component provides mechanical support to the stack, circulates the gases that participate in the electrochemical reaction within the fuel cell, and allows removal of excess heat from the system. The materials fabricated in this work were tested to determine their mechanical and thermal properties. These materials were produced by adding varying amounts of single carbon fillers to a polypropylene matrix (2.5 to 15 wt.% Ketjenblack EC-600 JD carbon black, 10 to 80 wt.% Asbury Carbons' Thermocarb TC-300 synthetic graphite, and 2.5 to 15 wt.% Hyperion Catalysis International's FIBRIL multi-walled carbon nanotubes). In addition, composite materials containing combinations of these three fillers were produced. The thermal conductivity results showed an increase in both through-plane and in-plane thermal conductivity, with the largest increase observed for synthetic graphite. The Department of Energy (DOE) had previously set a thermal conductivity goal of 20 W/m·K, which was surpassed by formulations containing 75 wt.% and 80 wt.% SG, yielding in-plane thermal conductivities of 24.4 W/m·K and 33.6 W/m·K, respectively. In addition, composites containing 2.5 wt.% CB, 65 wt.% SG, and 6 wt.% CNT in PP had an in-plane thermal conductivity of 37 W/m·K. Flexural and tensile tests were conducted. All composite formulations exceeded the flexural strength target of 25 MPa set by the DOE. The tensile and flexural moduli of the composites increased with higher concentrations of carbon fillers. Carbon black and synthetic graphite decreased the tensile and flexural strengths of the composites, whereas carbon nanotubes increased them. Mathematical models were applied to estimate the through-plane and in-plane thermal conductivities of single- and multiple-filler formulations, and the tensile modulus of single-filler formulations. For thermal conductivity, Nielsen's model yielded accurate values when compared to experimental results obtained by the Flash method. For prediction of the tensile modulus, Nielsen's model likewise yielded the smallest error between predicted and experimental values. The second part of this project consisted of the development of a curriculum in Fuel Cell and Hydrogen Technologies to address educational barriers identified by the Department of Energy. By creating new courses and enterprise programs in the areas of fuel cells and the use of hydrogen as an energy carrier, we introduced engineering students to the technologies, policies and challenges of this alternative energy. Feedback from students participating in these courses and enterprise programs indicates positive acceptance of the different educational tools. Results from a survey administered to students after participating in these courses showed an increase in knowledge and awareness of energy fundamentals, which indicates that the modules developed in this project are effective in introducing students to alternative energy sources.
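As a sketch of the kind of prediction involved, the following implements the general Lewis-Nielsen form commonly used for filler-loaded composites. The shape factor A, maximum packing fraction phi_m and the filler/matrix conductivities are illustrative assumptions, not the values fitted in this work; note also that phi here is a volume fraction, whereas the loadings above are given in wt.%.

```python
# Hedged sketch of the Lewis-Nielsen model for the effective thermal
# conductivity of a single-filler composite. Parameter values below are
# illustrative assumptions, not the thesis results.

def lewis_nielsen(k_matrix, k_filler, phi, A, phi_m):
    """Effective conductivity, W/(m*K); phi is the filler VOLUME fraction."""
    B = (k_filler / k_matrix - 1.0) / (k_filler / k_matrix + A)
    psi = 1.0 + ((1.0 - phi_m) / phi_m**2) * phi   # packing correction
    return k_matrix * (1.0 + A * B * phi) / (1.0 - B * psi * phi)

# Example: a graphite-like filler (assumed k ~ 300 W/m*K) in
# polypropylene (k ~ 0.24 W/m*K), shape factor A = 4, phi_m = 0.52.
for phi in (0.1, 0.3, 0.5):
    k = lewis_nielsen(k_matrix=0.24, k_filler=300.0, phi=phi, A=4.0, phi_m=0.52)
    print(f"phi = {phi:.1f} -> k ~ {k:.2f} W/(m*K)")
```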
Abstract:
The process of developing a successful stroke rehabilitation methodology requires four key components: a good understanding of the pathophysiological mechanisms underlying this brain disease, clear neuroscientific hypotheses to guide therapy, adequate clinical assessments of its efficacy on multiple timescales, and a systematic approach to the application of modern technologies to assist in the everyday work of therapists. Achieving this goal requires collaboration between neuroscientists, technologists and clinicians to develop well-founded systems and clinical protocols that are able to provide quantitatively validated improvements in patient rehabilitation outcomes. In this article we present three new applications of complementary technologies developed in an interdisciplinary matrix for acute-phase upper limb stroke rehabilitation – functional electrical stimulation, arm robot-assisted therapy and virtual reality-based cognitive therapy. We also outline the neuroscientific basis of our approach, present our detailed clinical assessment protocol and provide preliminary results from patient testing of each of the three systems showing their viability for patient use.