889 results for "multiple approach"
Abstract:
Credible spatial information characterizing the structure and site quality of forests is critical to sustainable forest management and planning, especially given the increasing demands on, and threats to, forest products and services. Forest managers and planners are required to evaluate forest conditions over a broad range of scales, contingent on operational or reporting requirements. Traditionally, forest inventory estimates are generated via a design-based approach that involves generalizing sample plot measurements to characterize an unknown population across a larger area of interest. However, field plot measurements are costly and, as a consequence, spatial coverage is limited. Remote sensing technologies have shown remarkable success in augmenting limited sample plot data to generate stand- and landscape-level spatial predictions of forest inventory attributes. Further enhancement of forest inventory approaches that couple field measurements with cutting-edge remotely sensed and geospatial datasets is essential to sustainable forest management. We evaluated a novel Random Forest-based k-Nearest Neighbors (RF-kNN) imputation approach to couple remote sensing and geospatial data with field inventory collected by different sampling methods and to generate forest inventory information across large spatial extents. Forest inventory data collected by the FIA program of the US Forest Service were integrated with optical remote sensing and other geospatial datasets to produce biomass distribution maps for a part of the Lake States and species-specific site index maps for the entire Lake States region. Targeting small-area application of state-of-the-art remote sensing, LiDAR (light detection and ranging) data were integrated with field data collected by an inexpensive method, called variable plot sampling, in the Ford Forest of Michigan Tech to derive a standing volume map in a cost-effective way. The outputs of the RF-kNN imputation were compared with independent validation datasets and extant map products based on different sampling and modeling strategies. The RF-kNN modeling approach was found to be very effective, especially for large-area estimation, and produced results statistically equivalent to the field observations or the estimates derived from secondary data sources. The models are useful to resource managers for operational and strategic purposes.
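A minimal sketch of how an RF-kNN imputation of the kind described above can be put together is shown below, assuming generic arrays of plot-level predictors and responses; the variable names, the choice of k = 5 neighbors, and the proximity definition are illustrative assumptions, not details taken from the thesis:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def rf_knn_impute(X_plots, y_plots, X_pixels, k=5, n_trees=500, seed=0):
    """Impute a forest attribute (e.g. biomass) for map pixels from k donor field plots,
    with plot-pixel similarity measured in the Random Forest terminal-node space."""
    y_plots = np.asarray(y_plots)
    rf = RandomForestRegressor(n_estimators=n_trees, random_state=seed)
    rf.fit(X_plots, y_plots)
    leaves_plots = rf.apply(X_plots)    # (n_plots, n_trees) leaf indices
    leaves_pixels = rf.apply(X_pixels)  # (n_pixels, n_trees) leaf indices
    preds = np.empty(len(leaves_pixels))
    for i, leaf_row in enumerate(leaves_pixels):
        # RF proximity: fraction of trees in which the pixel and a plot share a leaf.
        proximity = (leaves_plots == leaf_row).mean(axis=1)
        donors = np.argsort(-proximity)[:k]   # k most similar reference plots
        preds[i] = y_plots[donors].mean()     # impute from the donor plots
    return preds
```

Using the forest's terminal nodes as the similarity space is what distinguishes RF-kNN from a plain Euclidean kNN: a pixel and a plot are considered close when many trees route them to the same leaf.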
Abstract:
This is a redacted version of the final thesis. Copyright material has been removed to comply with UK Copyright Law.
Abstract:
The high cost of maize in Kenya is driven largely by East African regional commodity demand and agricultural drought. The production of maize, which is a common staple food in Kenya, is greatly affected by agricultural drought. However, estimates of drought risk and of its impact on maize production in Kenya are limited by the scarcity of reliable rainfall data. The objective of this study was to apply a novel hyperspectral remote sensing method to modelling temporal fluctuations of maize production and prices in five markets in Kenya. SPOT-VEGETATION NDVI time series were corrected for seasonal effects by computing standardized NDVI anomalies. The maize residual price time series was then related to the NDVI seasonal anomalies using a multiple linear regression modelling approach. The results show a moderately strong positive relationship (0.67) between the residual price series and global maize prices. Maize prices were high during drought periods (i.e. negative NDVI anomalies) and low during wet seasons (i.e. positive NDVI anomalies). This study concludes that NDVI is a good index for monitoring the evolution of maize prices and for food security emergency planning in Kenya. To obtain a stronger correlation between the wholesale maize price and the global maize price, future research could consider adding other price-driving factors to the regression models.
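As a rough illustration of the two processing steps named above, seasonal standardization of the NDVI series and the multiple linear regression, the sketch below uses assumed array shapes and variable names; it is not the study's own code:

```python
import numpy as np

def standardized_ndvi_anomaly(ndvi):
    """Remove the seasonal cycle from a (n_years, n_periods) NDVI array by subtracting
    the long-term mean and dividing by the long-term standard deviation of each period."""
    return (ndvi - ndvi.mean(axis=0)) / ndvi.std(axis=0)

def fit_multiple_linear_regression(X, y):
    """Ordinary least squares of y (e.g. the residual maize price series) on the columns
    of X (e.g. NDVI anomalies and other price-driving factors); returns the intercept
    followed by one coefficient per column."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    return beta
```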
Abstract:
Evidence-based management of Developmental Coordination Disorder (DCD) in school-age children requires putting into practice the best and most current research findings, including evidence that early identification, self-management, prevention of secondary disability, and enhanced participation are the most appropriate foci of school-based occupational therapy. Partnering for Change (P4C) is a new school-based intervention based upon these principles that has been developed and evaluated in Ontario, Canada over an 8-year period. Our experience to date indicates that its implementation in schools is highly complex with involvement of multiple stakeholders across health and education sectors. In this paper, we describe and reflect upon our team’s experience in using community-based participatory action research, knowledge translation, and implementation science to transform evidence-informed practice with children who have DCD.
Abstract:
In a professional and business-social context such as that of global hotel brands in the United Kingdom, intercultural communication, contacts and relationships are found at the heart of daily operations and of customer service. A large part of the clientele base of hotels in the United Kingdom is formed by individuals who belong to different cultural groups and travel in the country either for leisure or business. At the same time, the global workforce recruited in the hotel industry in the United Kingdom is a reality here to stay. Global travel and labor mobility are phenomena generated by socio-economic, cultural and political changes brought about by globalization. The hotel industry is therefore well acquainted with accommodating different cultures within hotel premises, in the case of external customers, and with managing cultural diversity in its workforce, in the case of internal customers. This thesis derives from research conducted on eight different global hotel brands in the United Kingdom, with reference to the three-, four- and five-star categories. The research aimed to answer the question of how hotels are organized to address issues of intercultural communication during customer service, and whether intercultural barriers arise during the intercultural interaction of hotel staff and global customers. To understand how global hotel brands operate, the research focused on three main areas relating to each hotel: organizational culture, customer service and customer care, and intercultural issues. The study utilized qualitative interviews with hotel management and non-management staff from different cultural backgrounds, together with public-space observations of customers and staff during check-in and check-out in the reception area and during dining at the café-bar and restaurant. Thematic analysis was also applied to the official web page of each hotel and to job advertisements to enhance the findings from the interviews and the observations. For the analysis of the data, Martin Heidegger's interpretive (hermeneutic) phenomenology was applied. Generally, it was found that hotel staff quite often feel perplexed about how to deal with and overcome, for instance, language barriers and religious issues, and about how to interpret non-verbal behaviors or matters of food culture relating to the intercultural aspect of customer service. In addition, it was interesting to find that attention to excellent customer service on the part of hotel staff is a top organizational value and customer care is a priority. Despite that, the participating hotel brands appear not to have yet realized how intercultural barriers can affect the daily operation of the hotel, the job performance and the psychology of hotel staff. Employees indicated that they were keen to receive diversity training, provided by their organizations, so as to learn about different cultural needs and expand their intercultural skills. The notion of diversity training in global hotel brands is based on the sense that one of the multiple aims of diversity management, as a practice and policy in the hotel workplace, is a better understanding of intercultural differences. Therefore, global hotel brands can consider diversity training as a practice that will benefit their hotel staff and clientele base at the same time. This can provide a distinctive organizational advantage in the hotel industry, with the potential to influence the effectiveness and performance of hotels.
Abstract:
The current approach to data analysis for the Laser Interferometer Space Antenna (LISA) depends on the time delay interferometry (TDI) observables, which have to be generated before any weak-signal detection can be performed. These are linear combinations of the raw data with appropriate time shifts that lead to the cancellation of the laser frequency noises. This is possible because of the multiple occurrences of the same noises in the different raw data streams. Originally, these observables were manually generated, starting with LISA as a simple stationary array and then adjusting them to incorporate the antenna's motion. However, none of the observables survived the flexing of the arms, in that they did not lead to cancellation with the same structure. The principal component approach is another way of handling these noises, presented by Romano and Woan, which simplifies the data analysis by removing the need to create the observables before the analysis. This method also depends on the multiple occurrences of the same noises but, instead of using them for cancellation, it takes advantage of the correlations that they produce between the different readings. These correlations can be expressed in a noise (data) covariance matrix, which appears in the Bayesian likelihood function when the noises are assumed to be Gaussian. Romano and Woan showed that performing an eigendecomposition of this matrix produces two distinct sets of eigenvalues that can be distinguished by the absence of laser frequency noise from one set. Transforming the raw data using the corresponding eigenvectors also produces data that are free from the laser frequency noises. This result led to the idea that the principal components may actually be time delay interferometry observables, since they produce the same outcome, that is, data that are free from laser frequency noise. The aims here were (i) to investigate the connection between the principal components and these observables, (ii) to prove that data analysis using them is equivalent to that using the traditional observables, and (iii) to determine how this method adapts to the real LISA, especially the flexing of the antenna. For testing the connection between the principal components and the TDI observables, a 10 x 10 covariance matrix containing integer values was used in order to obtain an algebraic solution for the eigendecomposition. The matrix was generated using fixed unequal arm lengths and stationary noises with equal variances for each noise type. The results confirm that all four Sagnac observables can be generated from the eigenvectors of the principal components. The observables obtained from this method, however, are tied to the length of the data and are not general expressions like the traditional observables; for example, the Sagnac observables for two different time stamps were generated from different sets of eigenvectors. It was also possible to generate the frequency-domain optimal AET observables from the principal components obtained from the power spectral density matrix. These results indicate that this method is another way of producing the observables, and therefore analysis using principal components should give the same results as that using the traditional observables. This was proven by the fact that the same relative likelihoods (within 0.3%) were obtained from the Bayesian estimates of the signal amplitude of a simple sinusoidal gravitational wave using the principal components and the optimal AET observables.
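In the notation usually adopted for this kind of analysis (the symbols below are assumptions, not the thesis's own: data vector d, signal template h(θ), noise covariance matrix C), the Gaussian likelihood and the eigendecomposition described above take the form

```latex
\mathcal{L}(d \mid \theta) \propto
  \frac{1}{\sqrt{\det(2\pi C)}}
  \exp\!\Big[ -\tfrac{1}{2}\,\big(d - h(\theta)\big)^{\mathsf T} C^{-1} \big(d - h(\theta)\big) \Big],
\qquad
C = V \Lambda V^{\mathsf T},
```

so that the transformed data \tilde{d} = V^{\mathsf T} d are decorrelated, and the components whose eigenvalues contain no laser-frequency-noise contribution play the role of the TDI-like observables.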
This method fails if the eigenvalues that are free from laser frequency noises are not generated. These are obtained from the covariance matrix, and the properties of LISA required for its computation are the phase-locking, the arm lengths and the noise variances. Preliminary results on the effects of these properties on the principal components indicate that only the absence of phase-locking prevented their production. The flexing of the antenna results in time-varying arm lengths, which will appear in the covariance matrix and, from our toy model investigations, this did not prevent the occurrence of the principal components. The difficulty with flexing, and also with non-stationary noises, is that the Toeplitz structure of the matrix will be destroyed, which will affect any computation methods that take advantage of this structure. In terms of separating the two sets of data for the analysis, this was not necessary because the laser frequency noises are very large compared to the photodetector noises, which resulted in a significant reduction of the data containing them after the matrix inversion. In the frequency domain the power spectral density matrices were block diagonal, which simplified the computation of the eigenvalues by allowing it to be done separately for each block. The results in general showed a lack of principal components in the absence of phase-locking, except for the zero bin. The major difference with the power spectral density matrix is that the time-varying arm lengths and the non-stationarity do not show up, because of the summation in the Fourier transform.
Abstract:
Nosocomial infections are a growing concern because they affect a large number of people and increase admission times in healthcare facilities. Additionally, their diagnosis is difficult, requiring multiple medical exams. This work therefore focuses on the development of a clinical decision support system to prevent these events from happening. The proposed solution is unique in that it caters for the explicit treatment of incomplete, unknown, or even contradictory information on a logic programming basis, which, to our knowledge, is done here for the first time.
Abstract:
Remote sensing is a promising approach for above ground biomass estimation, as forest parameters can be obtained indirectly. Analysis in space and time is quite straightforward due to the flexibility of the method in determining forest crown parameters with remote sensing. It can be used, for example, to evaluate and monitor the development of a forest area in time and the impact of disturbances such as silvicultural practices or deforestation. Vegetation indices, which condense data in a quantitative numeric manner, have been used to estimate several forest parameters, such as volume, basal area and above ground biomass. The objective of this study was the development of allometric functions to estimate above ground biomass using vegetation indices as independent variables. The vegetation indices used were the Normalized Difference Vegetation Index (NDVI), Enhanced Vegetation Index (EVI), Simple Ratio (SR) and Soil-Adjusted Vegetation Index (SAVI). QuickBird satellite data, with 0.70 m spatial resolution, were orthorectified, geometrically and atmospherically corrected, and the digital numbers were converted to top-of-atmosphere (ToA) reflectance. Forest inventory data and published allometric functions at tree level were used to estimate above ground biomass per plot. Linear functions were fitted for the monospecies and multispecies stands of two evergreen oaks (Quercus suber and Quercus rotundifolia) in multiple-use systems (montados). The allometric above ground biomass functions were fitted considering the mean and the median of each vegetation index per grid as independent variables. Species composition, as a dummy variable, was also considered an independent variable. The linear functions with the best performance are those with mean NDVI or mean SR as the independent variable. Noteworthy is that the two best functions for monospecies cork oak stands have median NDVI or median SR as the independent variable. When species composition dummy variables are included in the function (with stepwise regression), the best model has median NDVI as the independent variable. The vegetation indices with the worst model performance were EVI and SAVI.
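For reference, the four indices named above have standard formulations in terms of the near-infrared, red and blue reflectance bands; the sketch below uses commonly quoted coefficients (a SAVI soil factor of L = 0.5 and MODIS-style EVI coefficients), which are assumptions, since the study may have used different values:

```python
def vegetation_indices(nir, red, blue):
    """Standard vegetation indices computed from top-of-atmosphere reflectance (0-1 scale)."""
    ndvi = (nir - red) / (nir + red)                   # Normalized Difference Vegetation Index
    sr = nir / red                                     # Simple Ratio
    L = 0.5                                            # soil adjustment factor (typical value)
    savi = (1.0 + L) * (nir - red) / (nir + red + L)   # Soil-Adjusted Vegetation Index
    evi = 2.5 * (nir - red) / (nir + 6.0 * red - 7.5 * blue + 1.0)  # Enhanced Vegetation Index
    return ndvi, sr, savi, evi
```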
Abstract:
The Authors describe first-hand experiences carried out within the framework of selected international projects aimed at developing collaborative research and education using the One Health (OH) approach. Special emphasis is given to SAPUVETNET, a series of projects co-financed under the EU-ALFA program and aimed at supporting an international network on Veterinary Public Health (VPH) formed by veterinary faculties from Latin America (LA) and Europe (EU). SAPUVETNET has envisaged a series of objectives and activities aimed at promoting and enhancing VPH research, training and intersectoral collaboration across LA and EU using the OH approach, as well as participating in research and/or education projects and networks under the OH umbrella, namely EURNEGVEC (European Network for Neglected Vectors and Vector-Borne Infections), CYSTINET (European Network on Taeniosis/Cysticercosis), and NEOH (Network for Evaluation of One Health); the latter includes expertise in multiple disciplines (e.g. ecology, economics, human and animal health, epidemiology, social and environmental sciences) and has the primary purpose of enabling quantitative evaluation of OH initiatives by developing a standardized evaluation protocol. The Authors also give an account of the ongoing creation of OHIN (OH International Network), founded as a spin-off of SAPUVETNET. Finally, some examples of development cooperation projects characterised by an OH approach are also briefly mentioned.
Abstract:
A laboratory-based methodology was designed to assess the bioreceptivity of glazed tiles. The experimental set-up consisted of multiple steps: manufacturing of pristine and artificially aged glazed tiles, enrichment of phototrophic microorganisms, inoculation of phototrophs on glazed tiles, incubation under optimal conditions and quantification of biomass. In addition, tile intrinsic properties were assessed to determine which material properties contributed to tile bioreceptivity. Biofilm growth and biomass were appraised by digital image analysis, colorimetry and chlorophyll a analysis. SEM, micro-Raman and micro-particle induced X-ray emission analyses were carried out to investigate the biodeteriorating potential of phototrophic microorganisms on the glazed tiles. This practical and multidisciplinary approach showed that the accelerated colonization conditions allowed different types of tile bioreceptivity to be distinguished and to be related to precise characteristics of the material. Aged tiles showed higher bioreceptivity than pristine tiles due to their higher capillarity and permeability. Moreover, biophysical deterioration caused by chasmoendolithic growth was observed on colonized tile surfaces.
Abstract:
In recent years, radars have been used in many applications, such as precision agriculture and advanced driver assistance systems. Optimal techniques for estimating the number of targets and their coordinates require solving multidimensional optimization problems that entail huge computational efforts. This has motivated the development of sub-optimal estimation techniques able to achieve good accuracy at a manageable computational cost. Another technical issue in advanced driver assistance systems is the tracking of multiple targets. Even though various filtering techniques have been developed, new efficient and robust algorithms for target tracking can be devised by exploiting a probabilistic approach based on the use of factor graphs and the sum-product algorithm. The two contributions provided by this dissertation are the investigation of the filtering and smoothing problems from a factor graph perspective and the development of efficient algorithms for two- and three-dimensional radar imaging. Concerning the first contribution, a new factor graph for filtering is derived and the sum-product rule is applied to this graphical model; this makes it possible to reinterpret known algorithms and to develop new filtering techniques. Then, a general method, based on graphical modelling, is proposed to derive filtering algorithms that involve a network of interconnected Bayesian filters. Finally, the proposed graphical approach is exploited to devise a new smoothing algorithm. Numerical results for dynamic systems show that our algorithms can achieve a better complexity-accuracy tradeoff and better tracking capability than other techniques in the literature. Regarding radar imaging, various algorithms are developed for frequency-modulated continuous-wave radars; these algorithms rely on novel and efficient methods for the detection and estimation of multiple superimposed tones in noise. The accuracy achieved in the presence of multiple closely spaced targets is assessed on the basis of both synthetically generated data and measurements acquired with two commercial multiple-input multiple-output radars.
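As a point of reference for the factor-graph view of filtering mentioned above, in the linear-Gaussian case sum-product message passing on the filtering factor graph reduces to the familiar Kalman recursions; the sketch below shows one prediction-update step with placeholder model matrices, and is the textbook special case rather than any of the new algorithms developed in the dissertation:

```python
import numpy as np

def kalman_step(x, P, z, F, Q, H, R):
    """One prediction-update step of the Kalman filter: state estimate x, covariance P,
    measurement z, dynamic model (F, Q) and measurement model (H, R)."""
    # Prediction: propagate the estimate through the dynamic model.
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # Update: combine the prediction with the new measurement.
    S = H @ P_pred @ H.T + R                # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)     # Kalman gain
    x_new = x_pred + K @ (z - H @ x_pred)
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new
```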
Abstract:
To responsively manage the striped venus clam (Chamelea gallina) fisheries, a multidisciplinary approach has been adopted through the investigation of new and updated biological aspects (e.g. age, growth, reproduction, size at first maturity, fecundity) and of the interaction between the gear and target or non-target species (e.g. reburial ability, survival potential and damage exerted). The striped venus clam is an important socio-economic species in the Italian fishery context, highly regulated by national and international laws aimed at guaranteeing both social and ecological sustainability. Studies on growth and reproduction revealed that the size at first maturity is reached within the first year of life, whereas the present Minimum Conservation Reference Size of 22 mm is reached at two years of age. The annual reproductive cycle, which is driven by rises in seawater temperature and chlorophyll-a concentration, spans the warmer months (late spring to summer), with multiple spawning events of different intensity occurring over the spawning period, and the number of potentially emitted gametes is positively related to shell size. Reburial tests conducted on undamaged specimens highlighted the ability of clams to rebury in the sediment once discarded, independently of size. On the other hand, survival experiments in the laboratory and at sea, on both damaged and undamaged individuals, demonstrated that the species has a high survival rate, thus supporting the claim that discarded individuals can contribute to restocking the natural populations. Moreover, the evaluation and quantification of the damage induced by dredging on the discarded macro-benthic fauna living in association with C. gallina highlighted that soft-shelled or soft-bodied species are the most affected by the fishing process and are subject to higher mortality. All these findings are of pivotal importance to rationally support the management measures to be adopted in the striped venus clam fishery.
Abstract:
This dissertation proposes an analysis of the governance of European scientific research, focusing on the emergence of the Open Science paradigm: a new way of doing science, oriented towards the openness of every phase of the scientific research process and able to take full advantage of digital ICTs. The emergence of this paradigm is relatively recent, but in recent years it has become increasingly relevant. The European institutions have expressed a clear intention to embrace the Open Science paradigm (e.g. the European Open Science Cloud, EOSC, or the establishment of the Horizon Europe programme). This dissertation provides a conceptual framework for the multiple interventions of the European institutions in the field of Open Science, addressing the major legal challenges of its implementation. The study investigates the notion of Open Science, proposing a definition that takes into account all its dimensions in relation to the human and fundamental rights framework in which Open Science is grounded. The inquiry addresses the legal challenges related to the openness of research data, in light of the European Open Data framework and the impact of the GDPR on the context of Open Science. The last part of the study is devoted to the infrastructural dimension of the Open Science paradigm, exploring e-infrastructures. The focus is on a specific type of computational infrastructure: the High Performance Computing (HPC) facility. The adoption of HPC for research is analysed from the European perspective, investigating the EuroHPC project, and from the local perspective, through a case study of the HPC facility of the University of Luxembourg, the ULHPC. This dissertation intends to underline the relevance of a legal coordination approach among all actors and phases of the process in order to develop and implement the Open Science paradigm in adherence to the underlying human and fundamental rights.
Abstract:
Dynamical models of stellar systems are a powerful tool to study their internal structure and dynamics, to interpret the observed morphological and kinematical fields, and also to support numerical simulations of their evolution. We present a method especially designed to build axisymmetric Jeans models of galaxies, assumed to be stationary and collisionless stellar systems. The aim is the development of a rigorous and flexible modelling procedure for multicomponent galaxies, composed of different stellar and dark matter distributions and a central supermassive black hole. The stellar components, in particular, are intended to represent different galaxy structures, such as discs, bulges and halos, and can therefore have different structural (density profile, flattening, mass, scale-length), dynamical (rotation, velocity dispersion anisotropy), and population (age, metallicity, initial mass function, mass-to-light ratio) properties. The theoretical framework supporting the modelling procedure is presented, with the introduction of a suitable nomenclature, and its numerical implementation is discussed, with particular reference to the numerical code JASMINE2, developed for this purpose. We propose an approach for efficiently scaling the contributions in mass, luminosity, and rotational support of the different matter components, allowing for fast and flexible explorations of the model parameter space. We also offer different methods for the computation of the gravitational potentials associated with the density components, chosen for their easier numerical tractability. A few galaxy models are studied, showing internal and projected structural and dynamical properties of multicomponent galaxies, with a focus on axisymmetric early-type galaxies with complex kinematical morphologies. The application of galaxy models to the study of initial conditions for hydrodynamical and $N$-body simulations of galaxy evolution is also addressed, allowing in particular the investigation of the large number of interesting combinations of the parameters which determine the structure and dynamics of complex multicomponent stellar systems.
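For context, the equations solved by axisymmetric Jeans models of this kind are, in their standard textbook form (assuming a velocity ellipsoid aligned with the cylindrical coordinates (R, z) and no meridional cross term; the notation is not necessarily that of JASMINE2):

```latex
\frac{\partial\big(\rho\,\overline{v_z^2}\big)}{\partial z} = -\rho\,\frac{\partial \Phi}{\partial z},
\qquad
\frac{\partial\big(\rho\,\overline{v_R^2}\big)}{\partial R}
  + \frac{\rho\big(\overline{v_R^2} - \overline{v_\varphi^2}\big)}{R}
  = -\rho\,\frac{\partial \Phi}{\partial R},
```

where ρ is the density of a stellar component and Φ the total gravitational potential, to which all matter components (stellar distributions, dark matter halo, and central black hole) contribute.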
Abstract:
From its domestication until the present day, the horse has assumed multiple roles in human society. Over time, this condition and the lack of specific regulation have led to the development of different kinds of management systems for this species. This Ph.D. research project aims to investigate horse welfare under different management practices and housing systems, using a multidisciplinary approach that takes into account biological function, naturalness, and the affective dimension. The results are presented in five articles that identify risk factors that can undermine horse welfare and examine tools and parameters that can be employed for its assessment. Our research shows the importance of considering the evolutionary history and the species-specific and behavioural needs of horses in their management and housing. Sociality, free movement, diet composition and foraging routine, and the workload that these animals undergo are important factors that should be taken into account. Furthermore, this research has shown the importance of employing different parameters (e.g., behaviour, endocrinological parameters, and immune activity) in welfare assessment, and it proposes the use of horsehair DHEA (dehydroepiandrosterone) as a possible additional non-invasive measure for the investigation of long-term stress conditions. Finally, our results underline the importance of considering the affective dimension in welfare research. Recently, Judgement Bias Tests (JBT), which are based on the influence of affective states on the decision-making process, have been widely employed in animal welfare research. However, our studies show that the use of spatial JBT in horses can have some limitations. Still today, several management systems do not fulfill the species-specific needs of horses, so the implementation of specific regulations could improve horse welfare. A multidisciplinary approach to welfare assessment is fundamental, but the individual and its own characteristics should always be kept in mind, as they can influence not only physiological, immunological, and behavioural responses but also emotional and cognitive dimensions.