788 results for statistical modelling, wind effects, signal propagation, wireless sensor networks
Abstract:
Climate change is of growing importance in the study of large-scale spatial phenomena. Many experts assert that climate change will be one of the main drivers of ecological change in the coming decades and that its consequences will be unavoidable. In the physical environment, these changes will manifest themselves as melting ice caps, thawing permafrost, slope instability in mountain permafrost zones, and increases in the intensity, severity and frequency of extreme climatic events such as forest fires. Climate change will also affect the biological environment, for example through changes in the length of the growing season, an increase in invasive exotic species, and shifts in the distribution of living species. This study covers two aspects, both within the 2100 climate horizon: 1) changes in the spatial distribution of 39 bird species and 2) changes in the spatial patterns of fires, in Quebec's boreal forest. A statistical modelling approach shows that the spatial distribution of boreal forest birds is strongly related to bioclimatic variables (adjusted R² = 0.53). These results support bioclimatic models for the evening grosbeak and the black-capped chickadee that project a northward shift of each species' northern distribution limit tracking the intensity of climate warming. Finally, spatially explicit cellular-automaton modelling demonstrates how climate change will increase both the frequency of forest fires and the area burned in Quebec's boreal forest.
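A spatially explicit cellular automaton of the kind this abstract mentions can be sketched in a few lines. The grid size, cell states and spread probability below are illustrative assumptions, not the model from the study; a warmer, drier climate would correspond to a higher spread probability:

```python
import random

random.seed(0)

EMPTY, TREE, BURNING = 0, 1, 2

def step(grid, p_spread):
    """One automaton step: burning cells burn out and ignite 4-neighbour trees
    with probability p_spread."""
    rows, cols = len(grid), len(grid[0])
    new = [row[:] for row in grid]
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] == BURNING:
                new[r][c] = EMPTY  # fuel consumed
                for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    nr, nc = r + dr, c + dc
                    if (0 <= nr < rows and 0 <= nc < cols
                            and grid[nr][nc] == TREE
                            and random.random() < p_spread):
                        new[nr][nc] = BURNING
    return new

# 5x5 forest with a single ignition in the centre
grid = [[TREE] * 5 for _ in range(5)]
grid[2][2] = BURNING
grid = step(grid, p_spread=1.0)  # a drier climate would raise p_spread
print(sum(row.count(BURNING) for row in grid))
```

With `p_spread=1.0` the step is deterministic: the four orthogonal neighbours of the ignition point catch fire.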
Abstract:
In this paper, microstrip lines magnetically coupled to split-ring resonators (SRRs) are compared to electromagnetic bandgap (EBG) microstrip lines in terms of their stop-band performance and dimensions. In both types of transmission lines, signal propagation is inhibited in a certain frequency band. For EBG microstrip lines, the central frequency of such a forbidden band is determined by the period of the structure, whereas in SRR-based microstrip lines the position of the frequency gap depends on the quasi-static resonant frequency of the rings. The main relevant contribution of this paper is to provide a tuning procedure to control the gap width in SRR microstrip lines, and to show that by using SRRs, device dimensions are much smaller than those required by EBGs in order to obtain similar stop-band performance. This has been demonstrated by full-wave electromagnetic simulations and experimentally verified from the characterization of two fabricated microstrip lines: one with rectangular SRRs etched on the upper substrate side, and the other with a periodic perturbation of strip width. For similar rejection and 1-GHz gap width centered at 4.5 GHz, it has been found that the SRR microstrip line is five times shorter. In addition, no ripple is appreciable in the allowed band for the SRR-based structure, whereas due to dispersion, certain mismatch is expected in the EBG prototype. Due to the high-frequency selectivity, controllable gap width, and small dimensions, it is believed that SRRs coupled to planar transmission lines can have an actual impact on the design of stop-band filters compatible with planar technology, and can be an alternative to present solutions based on distributed approaches or EBGs.
Abstract:
Distributed systems are one of the most vital components of the economy. The most prominent example is probably the internet, a constituent element of our knowledge society. During the recent years, the number of novel network types has steadily increased. Amongst others, sensor networks, distributed systems composed of tiny computational devices with scarce resources, have emerged. The further development and heterogeneous connection of such systems imposes new requirements on the software development process. Mobile and wireless networks, for instance, have to organize themselves autonomously and must be able to react to changes in the environment and to failing nodes alike. Researching new approaches for the design of distributed algorithms may lead to methods with which these requirements can be met efficiently. In this thesis, one such method is developed, tested, and discussed in respect of its practical utility. Our new design approach for distributed algorithms is based on Genetic Programming, a member of the family of evolutionary algorithms. Evolutionary algorithms are metaheuristic optimization methods which copy principles from natural evolution. They use a population of solution candidates which they try to refine step by step in order to attain optimal values for predefined objective functions. The synthesis of an algorithm with our approach starts with an analysis step in which the wanted global behavior of the distributed system is specified. From this specification, objective functions are derived which steer a Genetic Programming process where the solution candidates are distributed programs. The objective functions rate how close these programs approximate the goal behavior in multiple randomized network simulations. The evolutionary process step by step selects the most promising solution candidates and modifies and combines them with mutation and crossover operators. 
This way, a description of the global behavior of a distributed system is translated automatically to programs which, if executed locally on the nodes of the system, exhibit this behavior. In our work, we test six different ways for representing distributed programs, comprising adaptations and extensions of well-known Genetic Programming methods (SGP, eSGP, and LGP), one bio-inspired approach (Fraglets), and two new program representations called Rule-based Genetic Programming (RBGP, eRBGP) designed by us. We breed programs in these representations for three well-known example problems in distributed systems: election algorithms, the distributed mutual exclusion at a critical section, and the distributed computation of the greatest common divisor of a set of numbers. Synthesizing distributed programs the evolutionary way does not necessarily lead to the envisaged results. In a detailed analysis, we discuss the problematic features which make this form of Genetic Programming particularly hard. The two Rule-based Genetic Programming approaches have been developed especially in order to mitigate these difficulties. In our experiments, at least one of them (eRBGP) turned out to be a very efficient approach and in most cases, was superior to the other representations.
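The evolutionary loop this abstract describes (a population of candidate programs refined by selection, crossover and mutation against objective functions) can be illustrated with a minimal sketch. The bit-string genome and toy fitness function below are stand-ins for the distributed programs and network-simulation objectives used in the thesis, not its actual representations:

```python
import random

random.seed(42)

POP_SIZE, GENOME_LEN, GENERATIONS = 30, 20, 60

def fitness(genome):
    # Toy objective (count of 1-bits); in the thesis this would rate how
    # closely a program's behaviour matches the specified global behaviour
    # in randomized network simulations.
    return sum(genome)

def mutate(genome, rate=0.05):
    # Flip each bit with a small probability
    return [1 - g if random.random() < rate else g for g in genome]

def crossover(a, b):
    # Single-point crossover of two parents
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

def tournament(pop, k=3):
    # Select the fittest of k randomly drawn candidates
    return max(random.sample(pop, k), key=fitness)

pop = [[random.randint(0, 1) for _ in range(GENOME_LEN)]
       for _ in range(POP_SIZE)]
for _ in range(GENERATIONS):
    pop = [mutate(crossover(tournament(pop), tournament(pop)))
           for _ in range(POP_SIZE)]

best = max(pop, key=fitness)
print(fitness(best))
```

The same select-recombine-mutate skeleton underlies all six program representations the thesis compares; only the genome encoding and variation operators differ.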
Abstract:
In any discipline where uncertainty and variability are present, it is important to have principles which are accepted as inviolate and which should therefore drive statistical modelling, statistical analysis of data and any inferences from such an analysis. Despite the fact that two such principles have existed over the last two decades, and from these a sensible, meaningful methodology has been developed for the statistical analysis of compositional data, the application of inappropriate and/or meaningless methods persists in many areas of application. This paper identifies at least ten common fallacies and confusions in compositional data analysis with illustrative examples, and provides readers with necessary, and hopefully sufficient, arguments to persuade the culprits why and how they should amend their ways.
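A common remedy in compositional data analysis, consistent with the principles this abstract invokes, is to analyse compositions in log-ratio coordinates rather than as raw proportions. A minimal sketch of the centred log-ratio (clr) transform, applied to a hypothetical 3-part composition:

```python
import math

def clr(composition):
    """Centred log-ratio transform of a composition (positive parts summing
    to a constant): log of each part divided by the geometric mean."""
    g = math.exp(sum(math.log(x) for x in composition) / len(composition))
    return [math.log(x / g) for x in composition]

sample = [0.2, 0.3, 0.5]  # hypothetical 3-part composition
coords = clr(sample)
print(coords)
# clr coordinates sum to zero, removing the unit-sum constraint that makes
# standard multivariate methods misleading on raw proportions
```

Working in these coordinates avoids the spurious-correlation fallacies that arise when correlation or regression is applied directly to parts constrained to sum to one.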
Abstract:
Global climate change and its impacts are being increasingly studied, and precipitation trends are one of the measures used to quantify climate change, especially in the tropics. This study uses daily rainfall data to determine whether there are changes in the long-term trends in rainfall variability in the East Coast Mountains of Mauritius during the last few decades, and to investigate the factors influencing the trends in the inter-annual to inter-decadal rainfall variability. Statistical modelling has been used to investigate the trends in total seasonal rainfall, the number of rain days and the mean amount of rain per rainy day, and the local, regional and large-scale factors that affect them on inter-annual to inter-decadal time scales. The strongest inter-decadal trend was found in the number of rain days for both rainfall seasons, and the other variables were found to have weak or insignificant trends. Both local factors, such as the surrounding sea surface temperatures, and large-scale phenomena, such as the Indian Monsoon and the El Niño Southern Oscillation, were found to influence rainfall patterns.
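The kind of trend estimate this abstract describes can be illustrated with an ordinary least-squares slope fitted to annual rain-day counts. The figures below are hypothetical, not the Mauritius data:

```python
def ols_slope(years, values):
    """Ordinary least-squares slope of values against years."""
    n = len(years)
    mx = sum(years) / n
    my = sum(values) / n
    num = sum((x - mx) * (y - my) for x, y in zip(years, values))
    den = sum((x - mx) ** 2 for x in years)
    return num / den

# hypothetical annual rain-day counts for one season
years = list(range(1980, 1990))
rain_days = [120, 118, 121, 115, 117, 112, 114, 110, 111, 108]

slope = ols_slope(years, rain_days)
print(slope)  # negative slope: a declining number of rain days
```

A full analysis would also test the slope's significance and add covariates (sea surface temperatures, monsoon and ENSO indices) as the study does.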
Abstract:
This review considers microbial inocula used in in vitro systems from the perspective of their ability to degrade or ferment a particular substrate, rather than the microbial species they contain. By necessity, this required an examination of the bacterial, protozoal and fungal populations of the rumen and hindgut with respect to factors influencing their activity. The potential to manipulate these populations through diet or sampling time is examined, as are inoculum preparation and level. The main alternatives to fresh rumen fluid (i.e., caecal digesta or faeces) are discussed with respect to end-point degradabilities and fermentation dynamics. Although the potential to use rumen contents obtained from donor animals at slaughter offers possibilities, the requirement to store the material and its subsequent loss of activity are limitations. Statistical modelling of data, although still requiring a good deal of developmental work, may offer an alternative approach. Finally, with respect to the range of in vitro methodologies and equipment employed, it is suggested that a degree of uniformity could be obtained through generation of a set of guidelines relating to the host animal, sampling technique and inoculum preparation. It was considered unlikely that any particular system would be accepted as the 'standard' procedure. However, before any protocol can be adopted, additional data are required (e.g., a method to assess inoculum 'quality' with respect to its fermentative and/or degradative activity), preparation/inoculation techniques need to be refined, and a methodology to store inocula without loss of efficacy must be developed.
Abstract:
The conventional method for assessing acute oral toxicity (OECD Test Guideline 401) was designed to identify the median lethal dose (LD50), using the death of animals as an endpoint. Introduced as an alternative method (OECD Test Guideline 420), the Fixed Dose Procedure (FDP) relies on the observation of clear signs of toxicity, uses fewer animals and causes less suffering. More recently, the Acute Toxic Class method and the Up-and-Down Procedure have also been adopted as OECD test guidelines. Both of these methods also use fewer animals than the conventional method, although they still use death as an endpoint. Each of the three new methods incorporates a sequential dosing procedure, which results in increased efficiency. In 1999, with a view to replacing OECD Test Guideline 401, the OECD requested that the three new test guidelines be updated. This was to bring them in line with the regulatory needs of all OECD Member Countries, provide further reductions in the number of animals used, and introduce refinements to reduce the pain and distress experienced by the animals. This paper describes a statistical modelling approach for the evaluation of acute oral toxicity tests, by using the revised FDP for illustration. Opportunities for further design improvements are discussed.
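The sequential dosing logic of the FDP can be sketched as follows. The fixed dose levels (5, 50, 300 and 2000 mg/kg) are those of OECD Test Guideline 420, but the decision rule is deliberately simplified and the study outcomes are hypothetical:

```python
# Fixed doses of the FDP (mg/kg body weight); decision logic simplified.
DOSES = [5, 50, 300, 2000]

def classify(outcome_at):
    """Walk up the fixed-dose ladder until evident toxicity is observed.

    `outcome_at` maps a dose to 'none', 'toxicity' or 'death' (hypothetical
    study results). Returns the dose identified as producing evident
    toxicity, or None if even the top dose is tolerated.
    """
    for i, dose in enumerate(DOSES):
        result = outcome_at(dose)
        if result == 'death':
            # severe outcome: the discriminating dose is the step below, if any
            return DOSES[i - 1] if i > 0 else dose
        if result == 'toxicity':
            # clear signs of toxicity, the FDP endpoint, without using death
            return dose
    return None

# hypothetical substance showing clear toxicity at 300 mg/kg
outcomes = {5: 'none', 50: 'none', 300: 'toxicity', 2000: 'death'}
print(classify(outcomes.get))
```

Because testing is sequential, most animals never receive the higher doses, which is the source of the efficiency gains the abstract mentions; the statistical modelling in the paper evaluates how reliably such a procedure classifies substances.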
Abstract:
Networks are ubiquitous in natural, technological and social systems. They are of increasing relevance for improved understanding and control of infectious diseases of plants, animals and humans, given the interconnectedness of today's world. Recent modelling work on disease development in complex networks shows: the relative rapidity of pathogen spread in scale-free compared with random networks, unless there is high local clustering; the theoretical absence of an epidemic threshold in scale-free networks of infinite size, which implies that diseases with low infection rates can spread in them, but the emergence of a threshold when realistic features are added to networks (e.g. finite size, household structure or deactivation of links); and the influence on epidemic dynamics of asymmetrical interactions. Models suggest that control of pathogens spreading in scale-free networks should focus on highly connected individuals rather than on mass random immunization. A growing number of empirical applications of network theory in human medicine and animal disease ecology confirm the potential of the approach, and suggest that network thinking could also benefit plant epidemiology and forest pathology, particularly in human-modified pathosystems linked by commercial transport of plant and disease propagules. Potential consequences for the study and management of plant and tree diseases are discussed.
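The modelling conclusion that control should target highly connected individuals can be illustrated with a worst-case spread on a hub-and-spoke contact network, a toy stand-in for a scale-free graph. Immunizing the hub stops the outbreak entirely, whereas immunizing a random spoke would not:

```python
def outbreak_size(adj, seed, immune):
    """Worst-case spread: every contact between an infected and a
    susceptible node transmits. Returns the number of nodes infected."""
    if seed in immune:
        return 0
    infected, frontier = {seed}, [seed]
    while frontier:
        node = frontier.pop()
        for nb in adj[node]:
            if nb not in infected and nb not in immune:
                infected.add(nb)
                frontier.append(nb)
    return len(infected)

# hypothetical hub-and-spoke contact network: node 0 is the hub,
# nodes 1..10 are only connected through it
adj = {0: list(range(1, 11))}
for i in range(1, 11):
    adj[i] = [0]

print(outbreak_size(adj, seed=3, immune=set()))  # no control: all reached
print(outbreak_size(adj, seed=3, immune={0}))    # hub immunized: spread stops
```

Real epidemic models add infection probabilities and recovery, but the structural point survives: removing the few highly connected nodes fragments transmission far more effectively than mass random immunization.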
Abstract:
There have been few rigorous assessments of the effectiveness of participatory processes for natural resource management. In Bangladesh, an approach known as Participatory Action Plan Development (PAPD) has been developed and applied. By combining problem identification and solution analysis by separate stakeholder groups with plenary sessions, it is claimed to result in consensus and more effective community-based management. Methodological issues in assessing the effectiveness of such development are discussed and good practice is illustrated. Under the same project, there were sites where PAPD had been used and others where it had not, so a comparative assessment could be made. However, for an appropriate assessment it is important to identify clear testable hypotheses regarding the expected benefits, appropriate measures, and other factors which may affect or confound the outcome. The paper illustrates how participatory assessment involving both individual opinions and focus groups can be systematically recorded, quantified and used with other data in statistical analysis. By using statistical modelling methods at an appropriate level of aggregation and controlling for other factors, benefits from PAPD were found to be significant. The systematic approaches and practices recommended from this example can be applied in similar situations to test the effectiveness of participatory processes using participatory assessments.
Abstract:
A wireless sensor network (WSN) is a group of sensors linked by a wireless medium to perform distributed sensing tasks. WSNs have attracted wide interest from academia and industry alike due to their diversity of applications in buildings, including home automation, smart environments, and emergency services. The primary goal of a WSN is to collect the data sensed by its sensors. These data are characteristically noisy and exhibit temporal and spatial correlation. As this paper demonstrates, extracting useful information from such data requires a variety of analysis techniques. Data mining is a process in which a wide spectrum of data analysis methods is used. It is applied in this paper to analyse data collected from WSNs monitoring an indoor environment in a building. A case study demonstrates how data mining can be used to optimise the use of office space in a building.
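As one example of the kind of pre-processing such an analysis involves, the temporal correlation in the readings can be exploited to denoise a trace before mining it. The trailing moving average below, and the temperature values, are illustrative assumptions rather than the paper's method:

```python
def moving_average(readings, window=3):
    """Smooth noisy sensor readings with a trailing moving average,
    exploiting the temporal correlation of successive samples."""
    out = []
    for i in range(len(readings)):
        lo = max(0, i - window + 1)
        out.append(sum(readings[lo:i + 1]) / (i + 1 - lo))
    return out

# hypothetical noisy temperature trace from one office sensor (deg C)
trace = [21.0, 23.5, 20.5, 21.2, 24.0, 20.8]
smoothed = moving_average(trace)
print(smoothed)
```

On the smoothed series, mining steps such as clustering rooms by occupancy-driven temperature patterns become far less sensitive to single-sample noise.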
Abstract:
This paper addresses the impact of imperfect synchronisation on distributed space-time block coding (D-STBC) when combined with incremental relaying. To suppress such an impact, a novel detection scheme is proposed that retains the two key features of the STBC principle: simplicity (i.e. linear computational complexity) and optimality (i.e. maximum likelihood). These two features make the new detector very suitable for low-power wireless networks (e.g. sensor networks).
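The linear-complexity, ML-optimal combining referred to here is the classical two-antenna Alamouti STBC principle under perfect synchronisation; the paper's contribution is preserving it when synchronisation is imperfect. A sketch of the textbook scheme (not the proposed detector), with hypothetical channel gains and symbols and noise omitted:

```python
# Hypothetical flat-fading channel gains from the two transmit antennas
h1, h2 = 0.8 + 0.3j, -0.4 + 0.9j
# Two symbols sent over two symbol periods (Alamouti encoding)
s1, s2 = 1 + 1j, -1 + 1j

# Period 1: antennas send (s1, s2); period 2: antennas send (-s2*, s1*)
r1 = h1 * s1 + h2 * s2
r2 = -h1 * s2.conjugate() + h2 * s1.conjugate()

# Linear combining recovers each symbol scaled by the total channel gain,
# which is why ML detection reduces to simple per-symbol decisions
gain = abs(h1) ** 2 + abs(h2) ** 2
s1_hat = (h1.conjugate() * r1 + h2 * r2.conjugate()) / gain
s2_hat = (h2.conjugate() * r1 - h1 * r2.conjugate()) / gain
print(s1_hat, s2_hat)
```

In the noiseless case the estimates equal the transmitted symbols exactly; imperfect timing between relays breaks this orthogonality, which is the problem the paper's detector addresses.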
Abstract:
The improvements obtained in cooling atmospheric remote-sensing instruments for space flight applications have promoted research into characterization of the necessary optical filters. By modelling the effects of temperature on the dispersive spectra of some constituent thin-film materials, the cooled performance of multilayer filter designs can be simulated and compared with the measured spectra from actual filters. Two actual filters are discussed for the 7 µm region: one a composite cut-on/cut-off design of 13% HBW and the other an integral narrowband design of 4% HBW.
Abstract:
Multi-rate multicarrier DS/CDMA is a potentially attractive multiple-access method for future wireless communications networks that must support multimedia, and thus multi-rate, traffic. Several receiver structures exist for single-rate multicarrier systems, but little has been reported on multi-rate multicarrier systems. Since high-performance detection such as coherent demodulation requires explicit knowledge of the channel, this paper proposes a subspace-based scheme, built on finite-length chip-waveform truncation, for timing and channel estimation in multi-rate multicarrier DS/CDMA systems, applicable to both multicode and variable-spreading-factor systems. The performance of the proposed scheme for these two multi-rate systems is validated via numerical simulations. The effects of the finite-length chip-waveform truncation on the performance of the proposed scheme are also analyzed theoretically.
Abstract:
Safety is of the highest priority in mining operations, and many traditional mining countries are currently investing in the implementation of wireless sensors capable of detecting risk factors, providing early warning signs to prevent accidents and significant economic losses. The objective of this research is to contribute to the implementation of sensors for continuous monitoring inside underground mines by providing technical parameters for the design of sensor networks applied in underground coal mines. The application of sensors capable of measuring variables of interest in real time promises to have a great impact on safety in the mining industry. The relationship between geological conditions and mining-method design establishes how a continuous monitoring system should be implemented. In this paper, the main causes of accidents in underground coal mines are established based on existing worldwide reports. Variables (temperature, gas, structural faults, fires) that can be related to the most frequent causes of disaster, together with their relevant measuring ranges, are then presented; the advantages for management and mining operations are also discussed, including an analysis of applying these systems in terms of Benefit, Opportunity, Cost and Risk. The publication focuses on coal mining because of the number of such events each year worldwide in which a significant number of workers are seriously injured or killed. Finally, a dynamic assessment of safety in underground mines is proposed; this approach offers a contribution to the design of customised monitoring networks, and the experience developed in coal mines provides a tool that facilitates the development and application of this technology within underground coal mines.
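A minimal sketch of the early-warning logic a monitoring node might run over the variables listed above. The threshold values are illustrative assumptions only, not recommended limits for any real mine:

```python
# Hypothetical early-warning thresholds for an underground coal-mine node
THRESHOLDS = {
    "methane_pct": 1.0,   # methane concentration, % by volume
    "temp_c": 35.0,       # ambient temperature, deg C
    "co_ppm": 30.0,       # carbon monoxide, ppm
}

def check(reading):
    """Return the names of monitored variables whose current readings
    exceed their early-warning threshold."""
    return [name for name, limit in THRESHOLDS.items()
            if reading.get(name, 0) > limit]

# one sensor sample: elevated methane, everything else nominal
alerts = check({"methane_pct": 1.4, "temp_c": 28.0, "co_ppm": 12.0})
print(alerts)
```

A deployed system would combine such per-node checks across the network and over time, consistent with the dynamic safety assessment the paper proposes.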
Abstract:
Background: In mammals, early-life environmental variations appear to affect microbial colonization and therefore competent immune development, and exposure to farm environments in infants has been inversely correlated with allergy development. Modelling these effects using manipulation of neonatal rodents is difficult due to their dependency on the mother, but the relatively independent piglet is increasingly identified as a valuable translational model for humans. This study was designed to correlate immune regulation in piglets with early-life environment. Methods: Piglets were nursed by their mother on a commercial farm, while isolator-reared siblings were formula fed. Fluorescence immunohistology was used to quantify T-reg and effector T-cell populations in the intestinal lamina propria, and the systemic response to food proteins was quantified by capture ELISA. Results: There was more CD4+ and CD4+CD25+ effector T-cell staining in the intestinal mucosa of the isolator-reared piglets compared with their farm-reared counterparts. In contrast, these isolator-reared piglets had a significantly reduced CD4+CD25+Foxp3+ regulatory T-cell population compared to farm-reared littermates, resulting in a significantly higher T-reg-to-effector ratio in the farm animals. Consistent with these findings, isolator-reared piglets had an increased serum IgG anti-soya response to novel dietary soya protein relative to farm-reared piglets. Conclusion: Here, we provide the first direct evidence, derived from intervention, that components of the early-life environment present on farms profoundly affect both local development of regulatory components of the mucosal immune system and immune responses to food proteins at weaning. We propose that neonatal piglets provide a tractable model which allows maternal and treatment effects to be statistically separated.