966 results for Time complexity


Relevance:

20.00%

Publisher:

Abstract:

A very significant share of globally produced energy is consumed in buildings, and a substantial part of that share goes to HVAC systems. These systems are increasing both in performance and in complexity, taking advantage of recent advances in mechanical and power electronic devices, particularly in speed variation. However, the improved efficiency is only achieved while the HVAC unit operates under the conditions specified by the manufacturer; otherwise, energy consumption rises to values considerably higher than the nominal ones. Adequate maintenance keeps the system running at its nominal performance, whereas its absence has an undesirable impact on both performance and the system's expected lifetime. Therefore, HVAC field maintenance plays a very important role in overall building sustainability. This work presents some results of incorrect HVAC use and the associated electric energy overconsumption, which can reach values 50% higher than those observed when the installation is operated according to an adequate maintenance plan.

Relevance:

20.00%

Publisher:

Abstract:

Presented at INForum - Simpósio de Informática (INFORUM 2015), 7-8 September 2015, Portugal.

Relevance:

20.00%

Publisher:

Abstract:

Presented at INForum - Simpósio de Informática (INFORUM 2015), 7-8 September 2015, Portugal.

Relevance:

20.00%

Publisher:

Abstract:

The recent technological advancements and market trends are causing an interesting phenomenon towards the convergence of the High-Performance Computing (HPC) and Embedded Computing (EC) domains. On one side, new kinds of HPC applications are being required by markets that need huge amounts of information to be processed within a bounded amount of time. On the other side, EC systems are increasingly concerned with providing higher performance in real time, challenging the performance capabilities of current architectures. The advent of next-generation many-core embedded platforms offers the opportunity to intercept this converging need for predictable high performance, allowing HPC and EC applications to be executed on efficient and powerful heterogeneous architectures integrating general-purpose processors with many-core computing fabrics. To this end, it is of paramount importance to develop new techniques for exploiting the massively parallel computation capabilities of such platforms in a predictable way. P-SOCRATES will tackle this important challenge by merging leading research groups from the HPC and EC communities. The time-criticality and parallelisation challenges common to both areas will be addressed by proposing an integrated framework for executing workload-intensive applications with real-time requirements on top of next-generation commercial-off-the-shelf (COTS) platforms based on many-core accelerated architectures. The project will investigate new HPC techniques that fulfil real-time requirements. The main sources of indeterminism will be identified, and efficient mapping and scheduling algorithms will be proposed, along with the associated timing and schedulability analysis, to guarantee the real-time and performance requirements of the applications.
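
The abstract above names timing and schedulability analysis without detailing it; as a hedged illustration only, the sketch below applies the classical Liu and Layland utilization bound for rate-monotonic scheduling to a hypothetical task set. The task parameters are invented and this is not the project's own analysis, which targets many-core platforms.

```python
# Minimal sketch: utilization-based schedulability check (Liu & Layland bound)
# for rate-monotonic scheduling on a single core. The task set is hypothetical.

def rm_schedulable(tasks):
    """tasks: list of (worst_case_execution_time, period) tuples."""
    n = len(tasks)
    utilization = sum(c / t for c, t in tasks)
    bound = n * (2 ** (1 / n) - 1)   # sufficient (not necessary) condition
    return utilization, bound, utilization <= bound

if __name__ == "__main__":
    hypothetical_tasks = [(1.0, 4.0), (2.0, 8.0), (1.5, 12.0)]
    u, b, ok = rm_schedulable(hypothetical_tasks)
    print(f"U = {u:.3f}, bound = {b:.3f}, schedulable by bound: {ok}")
```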

Relevance:

20.00%

Publisher:

Abstract:

EMC2 finds solutions for dynamic adaptability in open systems. It provides handling of mixed-criticality multicore applications in real-time conditions, with scalability and utmost flexibility, and full-scale deployment and management of integrated tool chains throughout the entire lifecycle.

Relevance:

20.00%

Publisher:

Abstract:

HHV-6 is the etiological agent of exanthem subitum, which is considered the sixth most frequent disease in infancy. In immunocompromised hosts, reactivation of latent HHV-6 infection may cause severe acute disease. We developed a SYBR Green real-time PCR for HHV-6 and compared the results with nested conventional PCR. A 214 bp PCR-derived fragment was cloned using the pGEM-T Easy system (Promega). Subsequently, serial dilutions were made in a pool of negative leucocytes, from 10⁻⁶ ng/µL (equivalent to 2465.8 molecules/µL) to 10⁻⁹ ng/µL (equivalent to 2.46 molecules/µL). Dilutions of the plasmid were amplified by SYBR Green real-time PCR, using primers HHV3 (5' TTG TGC GGG TCC GTT CCC ATC ATA 3') and HHV4 (5' TCG GGA TAG AAA AAC CTA ATC CCT 3'), and by conventional nested PCR using primers HHV1 (outer): 5' CAA TGC TTT TCT AGC CGC CTC TTC 3'; HHV2 (outer): 5' ACA TCT ATA ATT TTA GAC GAT CCC 3'; HHV3 (inner); and HHV4 (inner). The detection threshold was determined from the plasmid serial dilutions: 24.6 molecules/µL for the SYBR Green real-time PCR and 2.46 molecules/µL for the nested PCR. We chose the real-time PCR with the new SYBR Green chemistry for diagnosing and quantifying HHV-6 DNA in samples due to its sensitivity and lower risk of contamination.
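
For readers unfamiliar with how such standard curves relate DNA mass to molecule counts, the sketch below shows the usual copies-per-µL calculation. The construct length (a pGEM-T Easy backbone of roughly 3.0 kb plus the 214 bp insert) is an assumption for illustration, and the output is not meant to reproduce the figures quoted above.

```python
# Minimal sketch of the standard plasmid copy-number calculation used to build
# real-time PCR standard curves. The construct length is an assumption
# (pGEM-T Easy backbone ~3015 bp plus the 214 bp insert); the abstract's own
# molecule counts are not reproduced here.

AVOGADRO = 6.022e23          # molecules per mole
BP_MOLAR_MASS = 650.0        # average g/mol per base pair of dsDNA

def copies_per_ul(mass_ng_per_ul, construct_bp):
    mass_g = mass_ng_per_ul * 1e-9
    return mass_g * AVOGADRO / (construct_bp * BP_MOLAR_MASS)

if __name__ == "__main__":
    construct_bp = 3015 + 214                # assumed vector + insert size
    for mass in (1e-6, 1e-7, 1e-8, 1e-9):    # ng/uL, as in the dilution series
        print(f"{mass:.0e} ng/uL -> {copies_per_ul(mass, construct_bp):.1f} copies/uL")
```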

Relevance:

20.00%

Publisher:

Abstract:

The painting activity is one of the most complex and important activities in automobile manufacturing. The inherent complexity of the painting activity and the frequent need for repainting usually turn the painting process into a bottleneck in automobile assembly plants, which is reflected in higher operating costs and longer overall cycle times. One possible approach for optimizing the performance of the paint shop is to improve the efficiency of the color planning. This can be accomplished by evaluating the relative merits of a set of vehicle painting plans. Since this problem has a multicriteria nature, we resort to the multicriteria decision analysis (MCDA) methodology to tackle it. A recent trend in the MCDA field is the development of hybrid approaches that are used to achieve operational synergies between different methods. Here we apply, for the first time, an integrated approach that combines the strengths of the analytic hierarchy process (AHP) and the Preference Ranking Organization METHod for Enrichment Evaluations (PROMETHEE), aided by Geometrical Analysis for Interactive Aid (GAIA), to the problem of assessing alternative vehicle painting plans. The management of the assembly plant found the results valuable and is currently using them to schedule the painting activities so as to enhance the operational efficiency of the paint shop. This efficiency gain has allowed the management to bid for a new automobile model to be assembled at this specific plant.
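
To make the AHP-PROMETHEE combination concrete, the sketch below derives criteria weights with the AHP geometric-mean approximation and ranks alternatives with PROMETHEE II net flows under the "usual" preference function. The criteria, comparison matrix, and painting plans are hypothetical; the paper's actual model is not reproduced.

```python
# Minimal sketch of an AHP + PROMETHEE II combination: AHP derives criteria
# weights from a pairwise-comparison matrix (geometric-mean approximation),
# and PROMETHEE II ranks alternatives by net outranking flows. All numbers
# are hypothetical.
import numpy as np

def ahp_weights(pairwise):
    gm = np.prod(pairwise, axis=1) ** (1.0 / pairwise.shape[1])
    return gm / gm.sum()

def promethee_net_flows(scores, weights, maximize):
    n = scores.shape[0]
    pi = np.zeros((n, n))
    for a in range(n):
        for b in range(n):
            if a == b:
                continue
            for j, w in enumerate(weights):
                d = scores[a, j] - scores[b, j]
                if not maximize[j]:
                    d = -d
                pi[a, b] += w * (1.0 if d > 0 else 0.0)   # "usual" criterion
    phi_plus = pi.sum(axis=1) / (n - 1)
    phi_minus = pi.sum(axis=0) / (n - 1)
    return phi_plus - phi_minus

# Hypothetical pairwise comparisons of three criteria (e.g. cost, repaint rate, throughput)
comparison = np.array([[1, 3, 5], [1/3, 1, 2], [1/5, 1/2, 1]])
weights = ahp_weights(comparison)
# Hypothetical evaluations of three painting plans on those criteria
plans = np.array([[10.0, 0.08, 55.0], [12.0, 0.05, 60.0], [9.0, 0.10, 50.0]])
net = promethee_net_flows(plans, weights, maximize=[False, False, True])
print(dict(zip(["plan A", "plan B", "plan C"], np.round(net, 3))))
```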

Relevance:

20.00%

Publisher:

Abstract:

Our purpose is to determine the impact of histological factors observed in zero-time biopsies on early post-transplant kidney allograft function. Specifically, we want to compare the semi-quantitative Banff classification of zero-time biopsies with the quantification of % cortical area fibrosis. Sixty-three zero-time deceased-donor allograft biopsies were retrospectively scored semi-quantitatively using the Banff classification. By adding the individual chronic parameters, a Banff Chronic Sum (BCS) score was generated. The percentage of cortical area stained with Picrosirius Red (%PSR) was assessed and calculated with a computer program. A negative linear regression between %PSR and GFR at 3 years post-transplantation was established (Y = 62.08 - 4.6412X; p = 0.022). Significant negative correlations between arteriolar hyalinosis (rho = -0.375; p = 0.005), the chronic interstitial score (rho = 0.296; p = 0.02), the chronic tubular score (rho = 0.276; p = 0.04), the chronic vascular score (rho = -0.360; p = 0.007), BCS (rho = -0.413; p = 0.002) and GFR at 3 years were found. However, no correlation was found between %PSR and Ci, Ct or BCS. In multivariate linear regression, the negative predictive factors of 3-year GFR were BCS in the histological model, and donor kidney age, recipient age and black race in the clinical model. The BCS seems to be a good and easy-to-perform tool, available to every pathologist, with significant short-term predictive value. The %PSR predicts short-term kidney function in the univariate analysis but involves extra, time-consuming work beyond routine practice. We think that %PSR should be regarded as a research instrument.
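
Purely as an illustration of the reported univariate relation, the sketch below evaluates the fitted line Y = 62.08 - 4.6412X for a few hypothetical %PSR values; the inputs are invented and the multivariate models are not reproduced.

```python
# Minimal sketch: the reported univariate relation between % cortical Sirius Red
# area (%PSR) and 3-year GFR, used here only to illustrate the direction of the
# effect. The %PSR values below are hypothetical.

def predicted_gfr_3y(psr_percent):
    # Y = 62.08 - 4.6412 * X, as reported for the univariate model
    return 62.08 - 4.6412 * psr_percent

for psr in (1.0, 3.0, 5.0):
    print(f"%PSR = {psr:.1f} -> predicted 3-year GFR ≈ {predicted_gfr_3y(psr):.1f}")
```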

Relevance:

20.00%

Publisher:

Abstract:

The study of agent diffusion in biological tissues is very important for understanding and characterizing the optical clearing effects and the mechanisms involved: tissue dehydration and refractive index matching. From measurements made to study optical clearing, it is clear that light scattering is reduced and that the optical properties of the tissue are controlled in the process. On the other hand, optical measurements do not allow direct determination of the diffusion properties of the agent in the tissue, and some calculations are necessary to estimate those properties. This is imposed by the occurrence of two fluxes during optical clearing: water typically flowing out of the tissue and agent flowing into it. When the water content in the immersion solution is approximately the same as the free water content of the tissue, a balance is established for water and the agent flux dominates. To prove this concept experimentally, we measured the collimated transmittance of skeletal muscle samples under treatment with aqueous solutions containing different concentrations of glucose. After estimating the mean diffusion time values for each of the treatments, we represented those values as a function of the glucose concentration in solution. Such a representation presents a maximum diffusion time for a water content in solution equal to the tissue free water content. This maximum represents the real diffusion time of glucose in the muscle, and from this value we could calculate the corresponding diffusion coefficient.
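
The abstract does not state the formula used to convert the diffusion time into a diffusion coefficient; a relation commonly used in the optical clearing literature for a thin slab of thickness d is D ≈ d²/(π²τ), and the sketch below applies it with hypothetical numbers only.

```python
# Minimal sketch, under assumptions: recover a diffusion coefficient from a
# characteristic diffusion time using the common slab-geometry relation
# D ≈ d^2 / (pi^2 * tau). Thickness and time below are hypothetical, not the
# paper's measured values.
import math

def diffusion_coefficient(thickness_cm, tau_s):
    return thickness_cm ** 2 / (math.pi ** 2 * tau_s)

thickness_cm = 0.05      # hypothetical 0.5 mm muscle sample
tau_s = 300.0            # hypothetical mean diffusion time (s)
D = diffusion_coefficient(thickness_cm, tau_s)
print(f"D ≈ {D:.2e} cm^2/s")
```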

Relevance:

20.00%

Publisher:

Abstract:

Real-time monitoring applications may be used in a wireless sensor network (WSN) and may generate packet flows with strict quality of service requirements in terms of delay, jitter, or packet loss. When strict delays are imposed from source to destination, the packets must be delivered at the destination within an end-to-end delay (EED) hard limit in order to be considered useful. Since WSN nodes are scarce both in processing and energy resources, it is desirable that they only transport useful data, as this contributes to enhancing the overall network performance and improving energy efficiency. In this paper, we propose a novel cross-layer admission control (CLAC) mechanism to enhance the network performance and increase the energy efficiency of a WSN by avoiding the transmission of potentially useless packets. The CLAC mechanism uses an estimation technique to predict a packet's EED and decides to forward the packet only if it is expected to meet the EED deadline defined by the application, dropping it otherwise. The results obtained show that CLAC enhances the network performance by increasing the useful packet delivery ratio under high network loads and improves the energy efficiency at every network load.
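
The sketch below shows the general shape of such an admission test: estimate a packet's EED and forward it only if the deadline can still be met. The per-hop estimator, field names, and values are assumptions standing in for the paper's actual estimation technique.

```python
# Minimal sketch of a CLAC-style admission test: forward a packet only if its
# estimated end-to-end delay (EED) is expected to meet the application deadline.
# The per-hop estimator below is a naive stand-in for the paper's technique.

def estimate_eed(elapsed_s, hops_remaining, avg_per_hop_delay_s):
    """Elapsed delay so far plus a simple forecast for the remaining hops."""
    return elapsed_s + hops_remaining * avg_per_hop_delay_s

def admit(packet, avg_per_hop_delay_s):
    eed = estimate_eed(packet["elapsed_s"], packet["hops_remaining"], avg_per_hop_delay_s)
    return eed <= packet["deadline_s"]   # drop (do not forward) otherwise

pkt = {"elapsed_s": 0.12, "hops_remaining": 4, "deadline_s": 0.25}  # hypothetical packet state
print("forward" if admit(pkt, avg_per_hop_delay_s=0.02) else "drop")
```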

Relevance:

20.00%

Publisher:

Abstract:

Atmospheric temperatures characterize Earth as a slow-dynamics spatiotemporal system, revealing long memory and complex behavior. Temperature time series of 54 worldwide geographic locations are considered as representative of the Earth's weather dynamics. These data are then interpreted as the time evolution of a set of state space variables describing a complex system. The data are analyzed by means of multidimensional scaling (MDS) and the fractional state space portrait (fSSP). A centennial perspective covering the period from 1910 to 2012 allows MDS to identify similarities among different Earth locations. The multivariate mutual information is proposed to determine the “optimal” order of the time derivative for the fSSP representation. The fSSP emerges as a valuable alternative for visualizing system dynamics.
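
As an illustration of the MDS step only (the fractional state space portrait and the mutual-information criterion are beyond a short sketch), the code below applies classical MDS to pairwise distances between a few synthetic temperature-like series; the 54 real records are not used.

```python
# Minimal sketch: classical MDS of pairwise distances between temperature-like
# time series, as one ingredient of the analysis described above. The series
# here are synthetic stand-ins for the real records.
import numpy as np

def classical_mds(dist, dims=2):
    n = dist.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n          # centering matrix
    B = -0.5 * J @ (dist ** 2) @ J               # double-centered squared distances
    eigvals, eigvecs = np.linalg.eigh(B)
    order = np.argsort(eigvals)[::-1][:dims]
    return eigvecs[:, order] * np.sqrt(np.maximum(eigvals[order], 0.0))

rng = np.random.default_rng(0)
t = np.arange(1200)                               # 100 synthetic years of monthly data
series = [10 * np.sin(2 * np.pi * t / 12 + p) + rng.normal(0, 1, t.size)
          for p in rng.uniform(0, np.pi, 6)]
dist = np.array([[np.linalg.norm(a - b) for b in series] for a in series])
coords = classical_mds(dist)
print(np.round(coords, 2))                        # 2-D MDS map of the series
```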

Relevance:

20.00%

Publisher:

Abstract:

The last three decades have seen quite dramatic changes in the way we model time-dependent data. Linear processes have taken center stage in time series modeling. As far as the second-order properties are concerned, the theory and the methodology are very adequate. However, there is more and more evidence that linear models are not sufficiently flexible and rich enough for modeling purposes, and that failure to account for non-linearities can be very misleading and have undesired consequences.

Relevance:

20.00%

Publisher:

Abstract:

The complexity of systems is considered an obstacle to the progress of the IT industry. Autonomic computing is presented as the alternative to cope with this growing complexity. It is a holistic approach, in which systems are able to configure, heal, optimize, and protect themselves. Web-based applications are an example of systems where the complexity is high. The number of components, their interoperability, and workload variations are factors that may lead to performance failures or unavailability scenarios. The occurrence of these scenarios affects the revenue and reputation of businesses that rely on these types of applications. In this article, we present a self-healing framework for Web-based applications (SHõWA). SHõWA is composed of several modules, which monitor the application, analyze the data to detect and pinpoint anomalies, and execute recovery actions autonomously. The monitoring is done by a small aspect-oriented programming agent. This agent does not require changes to the application source code and includes adaptive and selective algorithms to regulate the level of monitoring. The anomalies are detected and pinpointed by means of statistical correlation. The data analysis detects changes in the server response time and analyzes whether those changes are correlated with the workload or are due to a performance anomaly. In the presence of performance anomalies, the data analysis pinpoints the anomaly. Upon the pinpointing of anomalies, SHõWA executes a recovery procedure. We also present a study about the detection and localization of anomalies, the accuracy of the data analysis, and the performance impact induced by SHõWA. Two benchmarking applications, exercised through dynamic workloads, and different types of anomaly were considered in the study. The results reveal that (1) SHõWA is able to detect and pinpoint anomalies while the number of affected end users is still low; (2) SHõWA detected anomalies without raising any false alarm; and (3) SHõWA does not induce a significant performance overhead (throughput was affected by less than 1%, and the response time delay was no more than 2 milliseconds).
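
To illustrate the correlation idea behind the data analysis, and not SHõWA's actual algorithm, the sketch below flags a response-time increase as a potential anomaly when it is not accompanied by a correlated workload increase; the data and the threshold are assumptions.

```python
# Minimal sketch of the correlation idea: a rise in response time that is not
# explained by a correlated rise in workload is flagged as a potential
# performance anomaly. Data and threshold are hypothetical, not SHõWA's
# actual parameters.
import numpy as np

def workload_explains_latency(workload, response_time, threshold=0.8):
    rho = np.corrcoef(workload, response_time)[0, 1]   # Pearson correlation
    return rho >= threshold

workload      = np.array([100, 102, 101, 99, 100, 103, 101, 100], dtype=float)
response_time = np.array([ 42,  44,  43, 45, 120, 160, 170, 180], dtype=float)  # latency spike

if not workload_explains_latency(workload, response_time):
    print("response-time change not correlated with workload -> possible anomaly")
else:
    print("response-time change tracks workload -> likely load-related")
```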

Relevance:

20.00%

Publisher:

Abstract:

Hepatitis B virus (HBV) is a major cause of chronic liver disease worldwide. Besides genotyping, quantitative analysis of HBV infection is extensively used for monitoring disease progression and treatment. Affordable viral load monitoring is desirable in resource-limited settings, and it has already been shown to be useful in developing countries for other viruses such as Hepatitis C virus (HCV) and HIV. In this paper, we describe the validation of a real-time PCR assay for HBV DNA quantification with TaqMan chemistry and MGB probes. Primers and probes were designed using an alignment of sequences from all HBV genotypes in order to amplify all of them equally. The assay is internally controlled and was standardized against an international HBV panel. Its efficacy was evaluated by comparing the results with two other methods: the Versant HBV DNA Assay 3.0 (bDNA, Siemens, NY, USA) and another real-time PCR from a reference laboratory. Intra-assay and inter-assay reproducibilities were determined, and the mean CV values obtained were 0.12 and 0.09, respectively. The assay was validated over a broad dynamic range and is efficient for amplifying all HBV genotypes, providing a good option for quantifying HBV DNA as a routine procedure with a cheap and reliable protocol.
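
As a small aside on the reproducibility figures quoted above, the coefficient of variation is simply the standard deviation of replicate measurements divided by their mean; the replicate values in the sketch below are hypothetical.

```python
# Minimal sketch: coefficient of variation (CV = std / mean) as used for
# intra- and inter-assay reproducibility. The replicate values are
# hypothetical, not the study's data.
import statistics

def coefficient_of_variation(values):
    return statistics.stdev(values) / statistics.mean(values)

intra_assay_replicates = [5.02, 5.10, 4.95, 5.05]   # e.g. replicate log10 results of one sample
print(f"intra-assay CV = {coefficient_of_variation(intra_assay_replicates):.3f}")
```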

Relevance:

20.00%

Publisher:

Abstract:

Dissertation presented to obtain the Master's degree in Informatics Engineering (Engenharia Informática) from the Universidade Nova de Lisboa, Faculdade de Ciências e Tecnologia.