856 results for Population set-based methods


Relevance: 100.00%

Publisher:

Abstract:

Estimating the relative orientation and position of a camera is one of the central topics in computer vision. The accuracy of a certain Finnish technology company's traffic sign inventory and localization process can be improved by applying this concept. The company's localization process uses video data produced by a vehicle-mounted camera, and the accuracy of the estimated traffic sign locations depends on the relative orientation between the camera and the vehicle. This thesis proposes a computer-vision-based software solution that estimates the camera's orientation relative to the vehicle's direction of movement from video data. The task was solved using feature-based methods and open-source software. On simulated data sets, the camera orientation estimates had an average absolute error of 0.31 degrees. The software solution can be integrated into the company's traffic sign localization pipeline.
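
The abstract gives no implementation details, but the kind of feature-based pipeline it describes can be sketched as follows. This is a minimal illustration, not the thesis's software: it assumes OpenCV, two consecutive grayscale frames, a known camera intrinsic matrix K, and a segment where the vehicle drives straight, in which case the translation direction recovered from the essential matrix approximates the movement direction in camera coordinates.

```python
# Minimal sketch (not the thesis's code): estimate the camera's yaw/pitch
# offset from the vehicle's movement direction using feature matches between
# two consecutive frames taken while driving straight.
import cv2
import numpy as np

def camera_offset_degrees(frame1, frame2, K):
    orb = cv2.ORB_create(nfeatures=2000)
    kp1, des1 = orb.detectAndCompute(frame1, None)
    kp2, des2 = orb.detectAndCompute(frame2, None)

    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(des1, des2)
    pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])

    # Essential matrix with RANSAC outlier rejection, then relative pose.
    E, _ = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC,
                                prob=0.999, threshold=1.0)
    _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K)

    # While driving straight, the unit translation t (camera coordinates) is
    # the movement direction; its angles to the optical axis (z) give the
    # camera's yaw/pitch offset relative to the direction of travel.
    t = t.ravel()
    yaw = np.degrees(np.arctan2(t[0], t[2]))
    pitch = np.degrees(np.arctan2(t[1], t[2]))
    return yaw, pitch
```

In practice, estimates from many frame pairs would be aggregated (e.g., over straight-driving segments) to reach the sub-degree accuracy reported above.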

Relevance: 100.00%

Publisher:

Abstract:

The exponential growth of the world population, built essentially on the use of fossil fuels, together with the fact that roughly three quarters of the world population lives in cities, has created an unsustainable level of CO2 emissions into the atmosphere. To minimise and attempt to reverse this trend, the European Union created the Covenant of Mayors programme, under which a territory commits to reducing its CO2 emissions by at least 20% by 2020 through a so-called Action Plan. This work develops a possible Sustainable Energy Action Plan for the municipality of Vale de Cambra. The energy matrix compiled for the municipality records a consumption of 47,154 toe and associated CO2 emissions of 108,084 tonnes for the reference year 2012. Based on the 2002-2012 study period, consumption and emissions of around 54,327 toe and 123,059 tonnes of CO2, respectively, are projected by 2020 if no measures are taken to reverse this trend. Based on the interpretation of the resulting energy matrix, a set of actions is proposed in the areas of industry, mobility, buildings, renewable energy, energy efficiency, governance and awareness-raising/training, aimed at a 20% reduction in emissions and consumption by 2020, down to 98,447 tonnes of CO2 emitted and 43,462 toe, respectively. It is concluded that energy planning is highly important for a territory because it articulates several aspects: it strengthens environmental protection (by reducing greenhouse gas emissions) and, by increasing the territory's energy efficiency, it raises its economic competitiveness, attracting more external investment and creating more employment opportunities and social well-being.
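
One point worth making explicit: the 20% targets quoted above are measured against the projected business-as-usual 2020 figures, not against the 2012 inventory. A quick check of the arithmetic, using only the numbers in the abstract:

```python
# The 2020 targets in the abstract are 20% below the business-as-usual
# projection for 2020 (54,327 toe and 123,059 t CO2), not the 2012 inventory.
projected_2020_toe = 54_327
projected_2020_co2_t = 123_059

print(round(0.8 * projected_2020_toe))    # 43462  -> the 43,462 toe target
print(round(0.8 * projected_2020_co2_t))  # 98447  -> the 98,447 t CO2 target
```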

Relevance: 100.00%

Publisher:

Abstract:

Seafood fraud - the misrepresentation of seafood products - has been discovered all around the world in different forms, such as false labelling, species substitution, short-weighting or over-glazing, in order to hide the correct identity, origin or weight of the product. Given the value of seafood products such as canned tuna, swordfish or grouper, these species are the main targets of commercial fraud, which chiefly consists of replacing valuable species with species of little or no value. A similar situation occurs with shelled shrimp or shellfish that are reduced to pieces for commercialization. Food fraud by species substitution is an emerging risk given the increasingly global food supply chain and the potential food safety issues. Economic food fraud is committed when food is deliberately placed on the market for financial gain, deceiving consumers (Woolfe, M. & Primrose, S. 2004). As a result of increased demand and the globalization of the seafood supply, more fish species are encountered in the market. In this scenario, it becomes essential to identify species unequivocally. Traditional taxonomy, based primarily on species identification keys, has shown a number of limitations in the use of distinctive features in many animal taxa, which are amplified when fish, crustaceans or shellfish are commercially transformed. Many fish species show a similar texture, so the certification of fish products is particularly important when fishes have undergone procedures that affect the overall anatomical structure, such as heading, slicing or filleting (Marko et al., 2004). The absence of morphological traits, the main characteristic usually used to identify animal species, represents a challenge, and molecular identification methods are required. Among them, DNA-based methods are the most frequently employed for food authentication (Lockley & Bardsley, 2000). In addition to food authentication and traceability, studies of taxonomy, population and conservation genetics, as well as analyses of dietary habits and prey selection, also rely on genetic analyses including DNA barcoding technology (Arroyave & Stiassny, 2014; Galimberti et al., 2013; Mafra, Ferreira, & Oliveira, 2008; Nicolé et al., 2012; Rasmussen & Morrissey, 2008), which consists of PCR amplification and sequencing of a specific region of the mitochondrial COI gene. The system proposed by P. Hebert et al. (2003) locates within the mitochondrial COI gene (cytochrome oxidase subunit I) a bio-identification system useful for the taxonomic identification of species (Lo Brutto et al., 2007). The COI region used for genetic identification - the DNA barcode - is short enough to allow, with current technology, the sequence (the nucleotide base pairs) to be decoded in a single step. Although this region represents only a tiny fraction of the mitochondrial DNA content in each cell, it has sufficient variability to distinguish the majority of species from one another (Biondo et al. 2016). This technique has already been employed to assess the actual identity and/or provenance of marketed products, as well as to unmask mislabelling and fraudulent substitutions that are difficult to detect, especially in processed seafood (Barbuto et al., 2010; Galimberti et al., 2013; Filonzi, Chiesa, Vaghi, & Nonnis Marzano, 2010). Nowadays, research concerns the use of genetic markers not only to identify the species and/or variety of fish, but also to identify molecular characters able to trace the origin and to provide an effective control tool for producers and consumers along the supply chain, in agreement with local regulations.
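
As a rough illustration of the barcoding idea only (not the authors' workflow), a query COI sequence can be assigned to the reference species with the highest sequence identity. Real pipelines rely on proper alignment tools (e.g., BLAST) and curated reference libraries such as BOLD; the reference sequences and the identity threshold below are hypothetical placeholders.

```python
# Toy COI-barcode species assignment: pick the reference most similar to the
# query and require a high identity before accepting a species-level match.
# Reference sequences and the 0.98 threshold are placeholders, not real data.
from difflib import SequenceMatcher

REFERENCE_BARCODES = {
    "Thunnus thynnus": "ACTTGGTAATTGACTAGTTCCCCTAATAATCGGTGCCCCAGAC",
    "Xiphias gladius": "ACTCTGTACCTAGTATTTGGTGCATGAGCCGGAATAGTGGGCA",
}

def assign_species(query, references=REFERENCE_BARCODES, min_identity=0.98):
    best_species, best_score = None, 0.0
    for species, ref in references.items():
        score = SequenceMatcher(None, query, ref).ratio()  # crude identity proxy
        if score > best_score:
            best_species, best_score = species, score
    # Only accept a species-level match above the identity threshold.
    return (best_species, best_score) if best_score >= min_identity else (None, best_score)
```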

Relevance: 100.00%

Publisher:

Abstract:

Many exchange rate papers articulate the view that instabilities constitute a major impediment to exchange rate predictability. In this thesis we implement Bayesian and other techniques to account for such instabilities, and examine some of the main obstacles to exchange rate models' predictive ability. In Chapter 2 we first consider a time-varying parameter model in which fluctuations in exchange rates are related to short-term nominal interest rates ensuing from monetary policy rules, such as Taylor rules. Unlike existing exchange rate studies, the parameters of our Taylor rules are allowed to change over time, in light of the widespread evidence of shifts in fundamentals - for example in the aftermath of the Global Financial Crisis. Focusing on quarterly data from the crisis, we detect forecast improvements upon a random walk (RW) benchmark for at least half, and for as many as seven out of 10, of the currencies considered. Results are stronger when we allow the time-varying parameters of the Taylor rules to differ between countries. In Chapter 3 we look closely at the role of time variation in parameters and other sources of uncertainty in hindering exchange rate models' predictive power. We apply a Bayesian setup that incorporates the notion that the relevant set of exchange rate determinants, and their corresponding coefficients, change over time. Using statistical and economic measures of performance, we first find that predictive models which allow for sudden, rather than smooth, changes in the coefficients yield significant forecast improvements and economic gains at horizons beyond one month. At shorter horizons, however, our methods fail to forecast better than the RW. We identify uncertainty in coefficient estimation, and uncertainty about the precise degree of coefficient variability to incorporate in the models, as the main factors obstructing predictive ability. Chapter 4 focuses on the problem of the time-varying predictive ability of economic fundamentals for exchange rates. It uses bootstrap-based methods to uncover the time-specific conditioning information for predicting fluctuations in exchange rates. Employing several metrics for the statistical and economic evaluation of forecasting performance, we find that our approach, based on pre-selecting and validating fundamentals across bootstrap replications, generates more accurate forecasts than the RW. The approach, known as bumping, robustly reveals parsimonious models with out-of-sample predictive power at the one-month horizon, and outperforms alternative methods, including Bayesian, bagging, and standard forecast combinations. Chapter 5 exploits the predictive content of daily commodity prices for monthly commodity-currency exchange rates. It builds on the idea that the effect of daily commodity price fluctuations on commodity currencies is short-lived, and therefore harder to pin down at low frequencies. Using MIxed DAta Sampling (MIDAS) models, and Bayesian estimation methods to account for time variation in predictive ability, the chapter demonstrates the usefulness of suitably exploiting such short-lived effects to improve exchange rate forecasts. It further shows that the usual low-frequency predictors, such as money supply and interest rate differentials, typically receive little support from the data at the monthly frequency, whereas MIDAS models featuring daily commodity prices are strongly supported. The chapter also introduces the random walk Metropolis-Hastings technique as a new tool for estimating MIDAS regressions.
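
The thesis's models are not reproduced here, but the sampling technique named in the last sentence can be illustrated generically. The sketch below applies a random-walk Metropolis-Hastings sampler to an ordinary linear regression with a Gaussian likelihood and flat priors - an assumption made purely for illustration, not the thesis's MIDAS specification.

```python
# Generic random-walk Metropolis-Hastings for regression coefficients.
# Toy Gaussian-likelihood linear regression with flat priors; a sketch of the
# sampling technique only, not the thesis's MIDAS model.
import numpy as np

def log_posterior(beta, X, y, sigma=1.0):
    resid = y - X @ beta
    return -0.5 * np.sum(resid ** 2) / sigma ** 2   # flat prior on beta

def rw_metropolis_hastings(X, y, n_iter=20_000, step=0.05, seed=0):
    rng = np.random.default_rng(seed)
    beta = np.zeros(X.shape[1])
    logp = log_posterior(beta, X, y)
    draws = np.empty((n_iter, X.shape[1]))
    for i in range(n_iter):
        proposal = beta + step * rng.standard_normal(beta.shape)  # random-walk step
        logp_prop = log_posterior(proposal, X, y)
        if np.log(rng.uniform()) < logp_prop - logp:              # accept/reject
            beta, logp = proposal, logp_prop
        draws[i] = beta
    return draws  # posterior draws; discard an initial burn-in before use

# Example: simulate data and sample.
rng = np.random.default_rng(1)
X = np.column_stack([np.ones(200), rng.standard_normal(200)])
y = X @ np.array([0.5, -1.2]) + rng.standard_normal(200)
posterior = rw_metropolis_hastings(X, y)
print(posterior[5_000:].mean(axis=0))   # should be close to [0.5, -1.2]
```

For a MIDAS regression, log_posterior would instead evaluate the MIDAS likelihood (with the weighting-function parameters included in the sampled vector), but the accept/reject mechanics stay the same.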

Relevance: 100.00%

Publisher:

Abstract:

At the ecosystem level, sustainable exploitation of fisheries resources depends not only on the status of target species but also on that of bycatch species, some of which are even more sensitive to exploitation. This is the case for a number of elasmobranch species (skates, rays and sharks) whose abundance declined during the 20th century. Further, the biology of elasmobranchs is still poorly known, and traditional fisheries stock assessment methods that use fisheries catches and scientific survey data to estimate abundance are expensive or even inapplicable due to the small numbers observed. The GenoPopTaille project attempts to apply to the case of the thornback ray (Raja clavata) recent genetic-based methods for estimating absolute population abundance, as well as to characterize its genetic diversity and population structure in the Northeast Atlantic. The poster will present the objectives, challenges and progress made so far by the project.
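
The abstract does not say which genetic abundance estimator GenoPopTaille applies. One widely used family of such methods is close-kin mark-recapture, in which parent-offspring pairs detected among genotyped individuals play the role of "recaptures"; the idealised sketch below (illustrative sample sizes, no mortality or sampling-selectivity corrections) shows only the core arithmetic and is not a description of the project's method.

```python
# Idealised close-kin mark-recapture: each juvenile has two parents in the
# adult population, so a random genotyped adult is a parent of a random
# genotyped juvenile with probability 2 / N_adults. Inverting the expected
# number of parent-offspring pairs (POPs) gives an abundance estimate.
# Numbers are illustrative only.

def ckmr_adult_abundance(n_adults_genotyped, n_juveniles_genotyped, n_pops):
    comparisons = n_adults_genotyped * n_juveniles_genotyped
    return 2 * comparisons / n_pops

print(round(ckmr_adult_abundance(1_500, 2_000, 60)))  # -> 100000 adults
```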

Relevance: 100.00%

Publisher:

Abstract:

Trends in nocturnal birds in Portugal: methods and analysis of a volunteer-based monitoring program. ABSTRACT: An evaluation of the analytical and data-collection methodologies applied by the NOCTUA-Portugal Program is extremely important to determine whether they are the most suitable for citizen-science studies. We compared the results of different analytical methodologies for estimating population trends of nocturnal bird species over the period of the NOCTUA-Portugal Program: simple graphical analysis, generalized linear models (GLM-Poisson and GLMM), generalized additive models (GAM-LOESS and GAM-mgcv) and the software TRIM. We also analysed the census methodology to assess the effect of point-count duration on the number of records, compare point-count detection efficiency with other studies, and examine the variation of responses throughout the night and the effects of time of year, wind, cloud cover and moonlight. The results showed that the most suitable analytical methodology was the GLMM and that no particular adjustment to the census methodology was necessary.
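
As a simplified stand-in for the kind of trend models compared above (a plain GLM-Poisson log-linear trend; the GLMM the study recommends adds random effects, e.g. per survey point, and the original analyses were presumably run in R), here is a Python sketch with made-up counts:

```python
# Log-linear (Poisson GLM) trend on yearly counts of a nocturnal bird species.
# Counts are made up; the study's preferred GLMM would add random effects on
# top of this fixed-effects trend.
import numpy as np
import statsmodels.api as sm

years = np.arange(2010, 2020)
counts = np.array([42, 40, 44, 38, 35, 37, 33, 31, 32, 29])  # hypothetical

X = sm.add_constant(years - years.min())                     # intercept + year index
fit = sm.GLM(counts, X, family=sm.families.Poisson()).fit()

annual_change = np.exp(fit.params[1]) - 1                    # multiplicative yearly trend
print(f"estimated trend: {annual_change:+.1%} per year")
```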

Relevance: 100.00%

Publisher:

Abstract:

Motion planning, or trajectory planning, commonly refers to the process of converting high-level task specifications into low-level control commands that can be executed on the system of interest. The system differs between applications: it can be an autonomous vehicle, an Unmanned Aerial Vehicle (UAV), a humanoid robot, or an industrial robotic arm. As human-machine interaction is essential in many of these systems, safety is fundamental and crucial. Many of the applications also involve performing a task optimally within a given time constraint. Therefore, in this thesis we focus on two aspects of the motion planning problem. One is the verification and synthesis of safe controls for autonomous ground and air vehicles in collision avoidance scenarios. The other is high-level planning for autonomous vehicles under timed temporal constraints. In the first part of our work, we propose a verification method, based on reachable sets, to prove the safety and robustness of a path planner and the path-following controls. We demonstrate the method on quadrotor and automobile applications. Secondly, we propose a reachable-set-based collision avoidance algorithm for UAVs. Instead of the traditional approach of collision avoidance between trajectories, we propose a collision avoidance scheme based on reachable sets and tubes. We then formulate the problem as a convex optimization problem that seeks a control-set design for the aircraft to avoid collision. We apply our approach to collision avoidance scenarios of quadrotors and fixed-wing aircraft. In the second part of our work, we address high-level planning problems with timed temporal logic constraints. Firstly, we present an optimization-based method for path planning of a mobile robot subject to timed temporal constraints in a dynamic environment. Temporal logic (TL) can express very complex task specifications such as safety, coverage, motion sequencing, etc. We use metric temporal logic (MTL) to encode the task specifications with timing constraints, translate the MTL formulae into mixed-integer linear constraints, and solve the associated optimization problem using a mixed-integer linear program solver. We have applied our approach to several case studies in complex dynamical environments subject to timed temporal specifications. Secondly, we present a timed-automaton-based method for planning under given timed temporal logic specifications. We use metric interval temporal logic (MITL), a member of the MTL family, to represent the task specification, and provide a constructive way to generate a timed automaton, together with methods to search for accepting runs on the automaton to find an optimal motion (or path) sequence for the robot to complete the task.
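
As a small self-contained illustration of the "MTL formulae to mixed-integer linear constraints" step described above (a toy 1-D robot and a single eventually-within-a-deadline requirement, encoded with the standard big-M trick; not the thesis's formulation or case studies, and PuLP/CBC stand in for whatever solver was actually used):

```python
# Toy timed-temporal-logic encoding: a 1-D robot x[t+1] = x[t] + u[t] with
# |u[t]| <= 1 must satisfy "at some step in the window 3..5, be in [3, 4]".
# Big-M encoding of the 'eventually' operator, solved with PuLP/CBC.
import pulp

T, M = 6, 100                                    # horizon and big-M constant
prob = pulp.LpProblem("mtl_toy", pulp.LpMinimize)

x = [pulp.LpVariable(f"x{t}", lowBound=-20, upBound=20) for t in range(T + 1)]
u = [pulp.LpVariable(f"u{t}", lowBound=-1, upBound=1) for t in range(T)]
z = {t: pulp.LpVariable(f"z{t}", cat="Binary") for t in (3, 4, 5)}

prob += pulp.lpSum(u)                            # placeholder linear objective
prob += x[0] == 0                                # initial state
for t in range(T):
    prob += x[t + 1] == x[t] + u[t]              # dynamics

# z[t] = 1 forces x[t] into [3, 4]; requiring sum(z) >= 1 encodes
# "eventually within steps 3..5".
for t, zt in z.items():
    prob += x[t] >= 3 - M * (1 - zt)
    prob += x[t] <= 4 + M * (1 - zt)
prob += pulp.lpSum(z.values()) >= 1

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print("trajectory:", [pulp.value(v) for v in x])
```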

Relevance: 100.00%

Publisher:

Abstract:

Cassava root is the main staple for 70% of the population in Mozambique, particularly in inaccessible rural areas, but is known to be low in iron. Anaemia is a public health problem in mothers and preschool children in Mozambique, and up to 40% of these cases are probably due to dietary iron deficiency. The World Health Organization (WHO) and the Food and Agriculture Organization of the United Nations (FAO) recognize the fortification of foodstuffs as an effective method to remedy dietary deficiencies of micronutrients, including iron. Cassava mahewu, a non-alcoholic fermented beverage, is prepared at subsistence level from cassava roots using indigenous procedures. The aim of the study was to standardize mahewu fermentation and investigate whether the type of cassava fermented, or the iron compound used for fortification, affected the final product. Roots of sweet and bitter varieties of cassava from four districts (Rapale, Meconta, Alto Molocue and Zavala) in Mozambique were peeled, dried and pounded to prepare flour. Cassava flour was cooked and fermented under controlled conditions (45°C for 24 h). The fermentation period and temperature were set based on the findings of a pilot study, which showed that an end-point pH of about 4.5 was regularly reached after 24 h at 45°C. Cassava mahewu was fortified with ferrous sulfate (FeSO4.7H2O) or ferrous fumarate (C4H2FeO4) at the beginning (time zero) and at the end of fermentation (24 h). The amount of iron added to the mahewu was based on the average of the approved range of iron used for the fortification of maize meal. The mean pH at the endpoint was 4.5, with 0.29% titratable acidity. The pH and acidity differed from those reported in previous studies on maize mahewu, whereas the solids content of 9.65% was found to be similar. Lactic acid bacteria (LAB) and yeast growth were not significantly different in mahewu fortified with either of the iron compounds. There was no significant difference between cassava mahewu made from bitter or sweet varieties. A standard method for the preparation and iron fortification of cassava mahewu was developed. It is recommended that fortification occur at the end of fermentation when done at household level.
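
For readers unfamiliar with fortification arithmetic, the mass of fortificant required follows directly from the elemental-iron fraction of each compound (roughly 20% for FeSO4.7H2O and 33% for ferrous fumarate). The iron target in the sketch below is a made-up placeholder, since the abstract does not state the exact value adopted from the maize-meal range.

```python
# Mass of fortificant needed to deliver a target amount of elemental iron.
# The 30 mg/l target is a hypothetical placeholder; the study took its value
# from the approved range for maize-meal fortification.
FE_MASS_FRACTION = {
    "ferrous sulfate heptahydrate (FeSO4.7H2O)": 55.85 / 278.0,  # ~20.1% Fe
    "ferrous fumarate (C4H2FeO4)":               55.85 / 169.9,  # ~32.9% Fe
}

target_fe_mg_per_litre = 30  # hypothetical target

for compound, fraction in FE_MASS_FRACTION.items():
    print(f"{compound}: {target_fe_mg_per_litre / fraction:.0f} mg per litre")
```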

Relevance: 100.00%

Publisher:

Abstract:

Premature cardiovascular events have been observed in systemic lupus erythematosus (SLE) patients, but the reason for this accelerated process is still debated; although traditional risk factors are more prevalent in such patients than in the general population, they do not seem to fully explain the enhanced risk. One of the most important conditions is a proatherogenic lipid profile. There are not enough data about it in Mexican SLE patients. Objective: To establish the differences in lipid profiles between Mexican patients with SLE and the general population. Material and methods: Observational, cross-sectional, descriptive and comparative study of SLE patients and age- and sex-matched healthy volunteers. We performed a full lipid profile (by spectrophotometry) after a 14-hour fast. The results were analyzed with SPSS® Statistics version 17. Results: We studied the full lipid profiles of 138 subjects, 69 with a diagnosis of SLE and 69 age- and sex-matched healthy volunteers; 95.7% were female and 4.3% male. Average age was 30 years; average body mass index (BMI) was 25.96 ± 5.96 kg/m² in SLE patients and 26.72 ± 4.36 kg/m² in the control group (p = 0.396). Average total cholesterol was 156 mg/dl in the SLE patients and 169.4 mg/dl in the control group (p = 0.028); average low-density lipoprotein (LDL) cholesterol was 85.27 mg/dl in the SLE patients and 97.57 mg/dl in the control group (p = 0.023). Conclusions: We did not find differences in the lipid profiles between patients and healthy volunteers that could explain the increased cardiovascular morbidity and mortality observed in SLE patients.
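
The group comparisons above were run in SPSS; an equivalent comparison of, say, total cholesterol between the two groups amounts to a two-sample test such as Welch's t-test. The sketch below uses made-up individual values (only group means appear in the abstract) purely to show the shape of the analysis.

```python
# Shape of the group comparison reported in the abstract: total cholesterol
# in SLE patients vs. matched controls via Welch's t-test. Individual values
# are simulated placeholders; only the group means are given in the abstract.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
sle_chol = rng.normal(156.0, 30.0, size=69)       # mg/dl, hypothetical
control_chol = rng.normal(169.4, 30.0, size=69)   # mg/dl, hypothetical

t_stat, p_value = stats.ttest_ind(sle_chol, control_chol, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```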

Relevance: 100.00%

Publisher:

Abstract:

This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.

Relevance: 100.00%

Publisher:

Abstract:

Master's thesis, Vinifera Euromaster - Instituto Superior de Agronomia - UL

Relevance: 100.00%

Publisher:

Abstract:

With recent advances in remote sensing processing technology, it has become more feasible to begin analysis of the enormous historical archive of remotely sensed data. These historical data provide valuable information on a wide variety of topics that can influence the lives of millions of people if processed correctly and in a timely manner. One such field is landslide mapping and inventory, where the data provide a historical reference for those who live near high-risk areas so that future disasters may be avoided. In order to map landslides remotely, an optimum method must first be determined. Historically, mapping has been attempted using pixel-based methods such as unsupervised and supervised classification. These methods are limited in that they characterize an image only spectrally, from single-pixel values, producing results prone to false positives and often lacking meaningful objects. Recently, several reliable methods of Object-Oriented Analysis (OOA) have been developed which utilize a full range of spectral, spatial, textural, and contextual parameters to delineate regions of interest. A comparison of the two approaches on a historical dataset of the landslide-affected city of San Juan La Laguna, Guatemala, demonstrates the benefits of OOA methods over unsupervised classification. Overall accuracies of 96.5% and 94.3% and F-scores of 84.3% and 77.9% were achieved for the OOA and unsupervised classification methods, respectively. The larger difference in F-score results from the low precision of unsupervised classification, caused by poor false-positive removal, the greatest shortcoming of this method.
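
The gap between the two F-scores is driven by precision: with recall roughly comparable, the false positives that unsupervised classification fails to remove pull its precision, and hence its F-score, down. A quick worked computation with illustrative counts (not the study's confusion matrices):

```python
# F1 is the harmonic mean of precision and recall, so extra false positives
# hurt it even when recall is unchanged. Counts are illustrative only.
def f1(tp, fp, fn):
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

good = f1(tp=90, fp=10, fn=10)   # precision = recall = 0.90
poor = f1(tp=90, fp=40, fn=10)   # same recall, precision drops to ~0.69

print(f"good false-positive removal: F1 = {good:.1%}")   # 90.0%
print(f"poor false-positive removal: F1 = {poor:.1%}")   # ~78%
```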

Relevance: 100.00%

Publisher:

Abstract:

Traditional engineering design methods are based on Simon's (1969) use of the concept of function, and as such collectively suffer from both theoretical and practical shortcomings. Researchers in the field of affordance-based design have borrowed from ecological psychology in an attempt to address the blind spots of function-based design, developing alternative ontologies and design processes. This dissertation presents function and affordance theory as both compatible and complementary. We first present a hybrid approach to design for technology change, followed by a reconciliation and integration of function and affordance ontologies for use in design. We explore the integration of a standard function-based design method with an affordance-based design method, and demonstrate how affordance theory can guide the early application of function-based design. Finally, we discuss the practical and philosophical ramifications of embracing affordance theory's roots in ecology and ecological psychology, and explore the insights and opportunities made possible by an ecological approach to engineering design. The primary contribution of this research is the development of an integrated ontology for describing and designing technological systems using both function- and affordance-based methods.
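
As a loose sketch, purely our own illustration rather than the dissertation's formal ontology, of how an integrated description of a technological system might record both its functions and its affordances:

```python
# Illustrative data structure (not the dissertation's ontology): an artifact
# described by both its functions (intended transformations) and its
# affordances (action possibilities offered to an agent).
from dataclasses import dataclass, field

@dataclass
class Function:
    verb: str              # e.g. "convert"
    flow: str              # e.g. "electrical energy to torque"

@dataclass
class Affordance:
    action: str            # what the artifact makes possible
    agent: str             # for whom the possibility exists

@dataclass
class Artifact:
    name: str
    functions: list[Function] = field(default_factory=list)
    affordances: list[Affordance] = field(default_factory=list)

drill = Artifact(
    "cordless drill",
    functions=[Function("convert", "electrical energy to torque")],
    affordances=[Affordance("drive fasteners one-handed", "adult user"),
                 Affordance("serve as an improvised hammer", "adult user")],
)
```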