993 results for Reference Area
The utilization bound of non-preemptive rate-monotonic scheduling in controller area networks is 25%
Abstract:
Consider a distributed computer system comprising many computer nodes, each interconnected with a controller area network (CAN) bus. We prove that if priorities to message streams are assigned using rate-monotonic (RM) and if the requested capacity of the CAN bus does not exceed 25% then all deadlines are met.
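The schedulability condition above can be sketched as a simple utilization test. This is an illustrative sketch only: the message streams, names and timing parameters below are hypothetical, and the paper's actual analysis of non-preemptive CAN arbitration is more involved than this check.

```python
# Sketch: rate-monotonic (RM) priority assignment and the 25% utilization test
# for CAN message streams. All message parameters are hypothetical.

def rm_priorities(streams):
    """RM priority order: shorter period -> higher priority."""
    return [s["name"] for s in sorted(streams, key=lambda s: s["period_ms"])]

def utilization(streams):
    """Requested bus capacity: sum of transmission time over period."""
    return sum(s["tx_time_ms"] / s["period_ms"] for s in streams)

streams = [
    {"name": "engine",  "period_ms": 10.0,  "tx_time_ms": 0.5},
    {"name": "brakes",  "period_ms": 20.0,  "tx_time_ms": 0.5},
    {"name": "climate", "period_ms": 100.0, "tx_time_ms": 1.0},
]

print(rm_priorities(streams))   # ['engine', 'brakes', 'climate']
print(utilization(streams))     # 0.085 -> below the 25% bound, so all deadlines are met
```

If the computed utilization stays at or below 0.25, the result quoted above guarantees that every message meets its deadline under RM priorities.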
Abstract:
OBJECTIVE This study investigated the serological status of dogs living in a visceral leishmaniasis-endemic area and its correlation with the parasitological condition of the animals. METHODS Canine humoral response was evaluated using the sera of 134 dogs by enzyme-linked immunosorbent assay and immunohistochemistry to detect parasites in the skin, lymph node, and spleen of the animals. The specific antibodies investigated were IgG, IgG1, IgG2, and IgE. RESULTS According to the parasitological, laboratory, and clinical findings, the dogs were placed into one of four groups: asymptomatic with (AP+, n = 21) or without (AP-, n = 36) Leishmania tissue parasitism and symptomatic with (SP+, n = 52) or without (SP-, n = 25) parasitism. Higher IgG and IgE levels were positively correlated with the infection condition and parasite load, but not with the clinical status. In all groups, total IgG was the predominant antibody, which occurred at the expense of IgG2 instead of IgG1. Most of the infected dogs tested positive for IgG (SP+, 98.1%; AP+, 95.2%), whereas this was not observed with IgE (SP+, 80.8%; AP+, 71.2%). The most relevant finding was the high positivity of the uninfected dogs for Leishmania-specific IgG (SP-, 60.0%; AP-, 44.4%), IgE (SP-, 44.0%; AP-, 27.8%), IgG1 (SP-, 28.0%; AP-, 22.2%), and IgG2 antibodies (SP-, 56.0%; AP-, 41.7%). CONCLUSIONS The serological status of dogs, as determined by any class or subclass of antibodies, did not accurately distinguish dogs infected with L. (L.) infantum chagasi from uninfected animals. The inaccuracy of the serological result may impair not only the diagnosis, but also epidemiological investigations and strategies for visceral leishmaniasis control. This complex serological scenario occurring in a visceral leishmaniasis-endemic area highlights the challenges associated with canine diagnosis and points out the difficulties experienced by veterinary clinicians and coordinators of control programs.
Abstract:
This technical report provides a reference guide to the implementation of the IEEE 802.15.4 protocol in nesC/TinyOS for the MICAz motes. The implementation is provided as a tool that can be used to implement, test and evaluate the functionalities currently defined in the protocol standard, as well as to enable the development of functionalities not yet implemented and of new add-ons to the protocol.
Abstract:
The characteristics of carbon fibre reinforced laminates have widened their use, from aerospace to domestic appliances. A common requirement is the need for drilling for assembly purposes. It is known that a drilling process that reduces the drill thrust force can decrease the risk of delamination. In this work, delamination assessment methods based on radiographic data are compared and correlated with mechanical test results (bearing test).
Abstract:
The performance of the Weather Research and Forecast (WRF) model in wind simulation was evaluated under different numerical and physical options for an area of Portugal located in complex terrain and characterized by its significant wind energy resource. The grid nudging and integration time of the simulations were the tested numerical options. Since the goal is to simulate the near-surface wind, the physical parameterization schemes regarding the boundary layer were the ones under evaluation. The influences of local terrain complexity and simulation domain resolution on the model results were also studied. Data from three wind measuring stations located within the chosen area were compared with the model results in terms of Root Mean Square Error, Standard Deviation Error and Bias. Wind speed histograms, occurrences and energy wind roses were also used for model evaluation. Overall, the model accurately reproduced the local wind regime, despite a significant underestimation of the wind speed. The wind direction is reasonably simulated by the model, especially in wind regimes where there is a clear dominant sector, but in the presence of low wind speeds the characterization of the wind direction (observed and simulated) is very subjective and led to higher deviations between simulations and observations. Within the tested options, results show that the use of grid nudging in simulations that should not exceed an integration time of 2 days is the best numerical configuration, and the parameterization set composed of the physical schemes MM5–Yonsei University–Noah is the most suitable for this site. Results were poorer at sites with higher terrain complexity, mainly due to limitations of the terrain data supplied to the model. Increasing the simulation domain resolution alone is not enough to significantly improve the model performance.
Results suggest that error minimization in the wind simulation can be achieved by testing and choosing a suitable numerical and physical configuration for the region of interest together with the use of high resolution terrain data, if available.
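The three evaluation metrics named above (Bias, Root Mean Square Error and Standard Deviation Error) can be sketched as follows; the observed and simulated wind-speed values here are hypothetical, not data from the study.

```python
# Sketch of the model-evaluation metrics: bias, RMSE and standard deviation
# of the error (RMSE with the mean bias removed). Data values are hypothetical.
import math

def bias(obs, sim):
    """Mean error; negative means the model underestimates."""
    return sum(s - o for o, s in zip(obs, sim)) / len(obs)

def rmse(obs, sim):
    """Root mean square error."""
    return math.sqrt(sum((s - o) ** 2 for o, s in zip(obs, sim)) / len(obs))

def stde(obs, sim):
    """Standard deviation of the error: spread of errors around the bias."""
    b = bias(obs, sim)
    return math.sqrt(sum((s - o - b) ** 2 for o, s in zip(obs, sim)) / len(obs))

obs = [5.2, 6.1, 4.8, 7.0]   # measured wind speed (m/s), hypothetical
sim = [4.9, 5.5, 4.6, 6.2]   # simulated wind speed (m/s), hypothetical

print(round(bias(obs, sim), 3))   # -0.475: a negative bias, i.e. underestimation
```

Note that RMSE² = STDE² + Bias², so the two components separate the systematic offset from the random scatter of the simulation error.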
Abstract:
The problem of selecting suppliers/partners is a crucial part of the decision-making process for companies that intend to perform competitively in their area of activity. The selection of a supplier/partner is a time- and resource-consuming task that involves data collection and a careful analysis of the factors that can positively or negatively influence the choice. Nevertheless, it is a critical process that significantly affects the operational performance of each company. In this work, five broad selection criteria were identified: Quality, Financial, Synergies, Cost, and Production System. Within these criteria, five sub-criteria were also included. After identifying the criteria, a survey was elaborated and companies were contacted in order to understand which factors carry more weight in their decisions when choosing partners. After interpreting the results and processing the data, a linear weighting model was adopted to reflect the importance of each factor. The model has a hierarchical structure and can be applied with the Analytic Hierarchy Process (AHP) method or Value Analysis. The goal of the paper is to supply a selection reference model that can serve as an orientation/pattern for decision making in the supplier/partner selection process.
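A minimal sketch of the linear weighting model described above: each candidate is scored per criterion and the criterion weights reflect survey-derived importance. The five criteria follow the abstract, but the weights, supplier names and scores below are hypothetical assumptions for illustration.

```python
# Sketch of a linear weighted-scoring model for supplier/partner selection.
# Criteria come from the abstract; weights and scores are hypothetical.

weights = {"Quality": 0.30, "Financial": 0.15, "Synergies": 0.10,
           "Cost": 0.25, "Production System": 0.20}   # weights sum to 1.0

suppliers = {
    "Supplier A": {"Quality": 8, "Financial": 6, "Synergies": 7,
                   "Cost": 5, "Production System": 7},
    "Supplier B": {"Quality": 7, "Financial": 8, "Synergies": 6,
                   "Cost": 8, "Production System": 6},
}

def weighted_score(scores, weights):
    """Linear weighting: sum of (criterion weight x criterion score)."""
    return sum(weights[c] * scores[c] for c in weights)

ranking = sorted(suppliers,
                 key=lambda s: weighted_score(suppliers[s], weights),
                 reverse=True)
print(ranking[0])   # the supplier with the highest weighted score
```

In a full AHP application the weights would come from pairwise comparison matrices rather than being assigned directly, but the final aggregation step is this same weighted sum.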
Abstract:
Although the computational power of mobile devices has been increasing, it is still not enough for some classes of applications. At present, these applications delegate the computing burden to servers located on the Internet. This model assumes always-on Internet connectivity and implies non-negligible latency. The thesis addresses the challenges of, and the contributions to, applying the concept of a mobile collaborative computing environment to wireless networks. The goal is to define a reference architecture for high performance mobile applications. Current work is focused on efficient data dissemination in a highly transitive environment, suitable for many mobile applications, and also on the reputation and incentive system available in this mobile collaborative computing environment. To this end, we are improving our already published reputation/incentive algorithm with knowledge of the usage patterns of the eduroam wireless network in the Lisbon area.
Cardiorespiratory physiotherapy in burn patients: an early intervention project
Abstract:
Master's degree in Physiotherapy
Abstract:
This paper presents a novel phase correction technique for Passive Radar which uses targets of opportunity present in the target area as references. The proposed methodology is quite simple and enables the use of low cost hardware with independent oscillators for the reference and surveillance channels which can be geographically distributed. © 2014 IEEE.
Abstract:
Nowadays, fibre reinforced plastics are used in a wide variety of applications. Apart from the best-known reinforcement fibres, such as glass or carbon, natural fibres can be seen as an economical alternative. However, some mistrust still limits the use of such materials, one of the main reasons being the inconsistency normally found in their mechanical properties. It should be noted that these materials are used more for their low density than for their high stiffness. In this work, two different types of reinforced plates were compared: a glass reinforced epoxy plate and a sisal reinforced epoxy plate. For material characterization purposes, tensile and flexural tests were carried out. The main properties of both materials, such as elastic modulus, tensile strength and flexural modulus, are presented and compared with reference values. Afterwards, the plates were drilled at two different feed rates, low and high, with two different tools, twist and brad type drills, while the cutting speed was kept constant. Thrust forces during drilling were monitored. Then, the delamination area around the hole was assessed using digital images processed with a previously developed computational platform. Finally, the drilled plates were mechanically tested for bearing and open-hole resistance. Results were compared and correlated with the measured delamination. The conclusions contribute to the understanding of natural fibre reinforced plastics as a substitute for glass fibre reinforced plastics, helping to reduce costs without compromising reliability, as well as of the effect of delamination on the mechanical resistance of this type of composite.
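Drilling-induced delamination is commonly quantified by simple ratios computed from the measured damage around the hole. The sketch below shows two conventional metrics, the diameter-based delamination factor Fd = Dmax/D0 and an area-based ratio; it is an assumption for illustration, not necessarily the exact measure implemented in the paper's image-processing platform, and all numeric values are hypothetical.

```python
# Sketch of two common delamination metrics derived from hole-damage images.
# Values are hypothetical; the paper's own platform may use a different variant.
import math

def delamination_factor(d_max_mm, d_hole_mm):
    """Conventional delamination factor: max damage diameter / nominal hole diameter."""
    return d_max_mm / d_hole_mm

def area_ratio(damage_area_mm2, d_hole_mm):
    """Area-based variant: delaminated area / nominal hole area."""
    hole_area = math.pi * (d_hole_mm / 2.0) ** 2
    return damage_area_mm2 / hole_area

# Hypothetical measurement: 6 mm drill, 7.2 mm maximum damage diameter.
print(delamination_factor(7.2, 6.0))   # 1.2 (1.0 would mean no visible delamination)
```

Correlating such a metric against thrust force and against bearing/open-hole strength is what allows conclusions like those stated above about the consequence of delamination on mechanical resistance.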
Abstract:
Dissertation presented at the Faculdade de Ciências e Tecnologia, Universidade Nova de Lisboa, for the degree of Master in Conservation and Restoration. Area of specialization: stone.
Abstract:
No literature data above atmospheric pressure could be found for the viscosity of TOTM. As a consequence, the present viscosity results could only be compared upon extrapolation of the vibrating-wire data to 0.1 MPa. Independent viscosity measurements were performed at atmospheric pressure, using an Ubbelohde capillary, in order to compare with the vibrating-wire results extrapolated by means of the above-mentioned correlation. The two data sets agree within ±1%, which is commensurate with the mutual uncertainty of the experimental methods. Comparisons of the literature data obtained at atmospheric pressure with the present extrapolated vibrating-wire viscosity measurements have shown agreement within ±2% for temperatures up to 339 K and within ±3.3% for temperatures up to 368 K. © 2014 Elsevier B.V. All rights reserved.
Abstract:
In Part I of the present work we describe the viscosity measurements performed on tris(2-ethylhexyl) trimellitate, or 1,2,4-benzenetricarboxylic acid, tris(2-ethylhexyl) ester (TOTM), up to 65 MPa and at six temperatures from (303 to 373) K, using a new vibrating-wire instrument. The main aim is to contribute to the proposal of that liquid as a potential reference fluid for high viscosity, high pressure and high temperature. The present Part II is dedicated to reporting the density measurements of TOTM, necessary not only to compute the viscosity data presented in Part I, but also as complementary data for the mentioned proposal. The present density measurements were obtained using a vibrating U-tube densimeter, model DMA HP, with a model DMA5000 as a reading unit, both instruments from Anton Paar GmbH. The measurements were performed along five isotherms from (293 to 373) K and at eleven different pressures up to 68 MPa. As far as the authors are aware, the viscosity and density results are the first above atmospheric pressure to be published for TOTM. Due to TOTM's high viscosity, its density data were corrected for the viscosity effect on the U-tube density measurements. This effect was estimated using two Newtonian viscosity standard liquids, 20 AW and 200 GW. The density data were correlated with temperature and pressure using a modified Tait equation, with deviations within ±0.25%. The expanded uncertainty of the present density results is estimated as ±0.2% at a 95% confidence level. Furthermore, the isothermal compressibility, κ_T, and the isobaric thermal expansivity, α_p, were obtained by differentiation of the modified Tait equation used for correlating the density data. The corresponding uncertainties, at a 95% confidence level, are estimated to be less than ±1.5% and ±1.2%, respectively.
No isobaric thermal expansivity or isothermal compressibility data for TOTM were found in the literature. © 2014 Elsevier B.V. All rights reserved.
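The modified Tait correlation mentioned above, and the isothermal compressibility obtained by differentiating it, can be sketched as follows. The functional form is the common Tait form ρ(p) = ρ₀ / (1 − C·ln((B + p)/(B + p₀))); the coefficient values below are hypothetical placeholders, not the fitted TOTM parameters.

```python
# Sketch of a Tait-type density correlation at fixed temperature and the
# isothermal compressibility derived from it. Coefficients are hypothetical.
import math

def tait_density(p_mpa, rho0, B, C, p0=0.1):
    """rho(p) = rho0 / (1 - C * ln((B + p) / (B + p0))), at constant T."""
    return rho0 / (1.0 - C * math.log((B + p_mpa) / (B + p0)))

def isothermal_compressibility(p_mpa, B, C, p0=0.1):
    """kappa_T = (1/rho) * (d rho / d p)|_T, from differentiating the Tait form."""
    denom = 1.0 - C * math.log((B + p_mpa) / (B + p0))
    return C / ((B + p_mpa) * denom)   # units: 1/MPa

# Hypothetical coefficients: rho0 at 0.1 MPa, B in MPa, C dimensionless.
rho_atm = tait_density(0.1, rho0=985.0, B=150.0, C=0.088)
rho_hi  = tait_density(68.0, rho0=985.0, B=150.0, C=0.088)
print(rho_hi > rho_atm)   # True: density increases with pressure
```

In the same way, fitting ρ₀(T) along isotherms and differentiating with respect to temperature at constant pressure yields the isobaric thermal expansivity α_p = −(1/ρ)(∂ρ/∂T)|_p reported above.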
Abstract:
On-chip debug (OCD) features are frequently available in modern microprocessors. Their contribution to shorten the time-to-market justifies the industry investment in this area, where a number of competing or complementary proposals are available or under development, e.g. NEXUS, CJTAG, IJTAG. The controllability and observability features provided by OCD infrastructures provide a valuable toolbox that can be used well beyond the debugging arena, improving the return on investment rate by diluting its cost across a wider spectrum of application areas. This paper discusses the use of OCD features for validating fault tolerant architectures, and in particular the efficiency of various fault injection methods provided by enhanced OCD infrastructures. The reference data for our comparative study was captured on a workbench comprising the 32-bit Freescale MPC-565 microprocessor, an iSYSTEM IC3000 debugger (iTracePro version) and the Winidea 2005 debugging package. All enhanced OCD infrastructures were implemented in VHDL and the results were obtained by simulation within the same fault injection environment. The focus of this paper is on the comparative analysis of the experimental results obtained for various OCD configurations and debugging scenarios.
Abstract:
The paper proposes a Flexibility Requirements Model and a Factory Templates Framework to support dynamic Virtual Organization decision-makers in reaching an effective response to emergent business opportunities while ensuring profitability. Through the construction and analysis of the flexibility requirements model, network managers can conceive better strategies to model and breed new dynamic VOs. This paper also presents the leagility concept as a new paradigm fit to equip network management with a hybrid approach that better tackles the performance challenges imposed by new and competitive business environments.