951 results for Worst-case execution-time
Abstract:
Tool path generation is one of the most complex problems in Computer Aided Manufacturing. Although some efficient strategies have been developed, most of them are only useful for standard machining. The algorithms used for tool path computation, however, demand high computational performance, which makes their implementation on many existing systems very slow or even impractical. Hardware acceleration is an incremental solution that can be cleanly added to these systems while keeping everything else intact. It is completely transparent to the user, and its cost is much lower and its development time much shorter than replacing the computers with faster ones. This paper presents an optimisation that harnesses the power of multi-core Graphics Processing Units (GPUs) to speed up tool path computation. The improvement is applied to a highly accurate and robust tool path generation algorithm. As a case study, the paper presents a fully implemented algorithm used for turning lathe machining of shoe lasts. A comparative study shows the gain achieved in terms of total computing time: execution is almost two orders of magnitude faster than on modern PCs.
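As an illustration of the kind of GPU offload described above, here is a minimal sketch in Python using CuPy; the surface model and the offset formula are generic stand-ins invented for illustration, not the paper's shoe-last algorithm:

```python
# Minimal sketch of offloading a per-point tool path computation to the GPU.
# The surface model and offset formula are hypothetical stand-ins; only the
# offload pattern (evaluate all surface points in parallel) is illustrated.
import cupy as cp   # GPU drop-in replacement for most of NumPy's API

TOOL_RADIUS = 5.0                                   # mm, hypothetical cutter radius

# One radius sample per (angle, z) pair of the turned part's surface.
angles = cp.linspace(0, 2 * cp.pi, 4096)
z = cp.linspace(0.0, 300.0, 2048)
A, Z = cp.meshgrid(angles, z)
surface_r = 40.0 + 5.0 * cp.sin(3 * A) + 0.02 * Z   # stand-in surface model

# All points are processed in parallel on the GPU: the tool centre must stay
# TOOL_RADIUS away from the surface along the radial direction.
tool_r = surface_r + TOOL_RADIUS

tool_path = cp.asnumpy(tool_r)                      # copy the result back to the host
print(tool_path.shape)                              # (2048, 4096)
```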
Abstract:
LIDAR (LIght Detection And Ranging) first-return elevation data of the Boston, Massachusetts region from MassGIS at 1-meter resolution. This LIDAR data was captured in Spring 2002. LIDAR first-return data (which shows the highest ground features, e.g. tree canopy, buildings, etc.) can be used to produce a digital terrain model of the Earth's surface. This dataset consists of 74 First Return DEM tiles. The tiles are 4km by 4km areas corresponding to the MassGIS orthoimage index. This data set was collected using 3Di's Digital Airborne Topographic Imaging System II (DATIS II). The area of coverage corresponds to the following MassGIS orthophoto quads covering the Boston region (MassGIS orthophoto quad IDs: 229890, 229894, 229898, 229902, 233886, 233890, 233894, 233898, 233902, 233906, 233910, 237890, 237894, 237898, 237902, 237906, 237910, 241890, 241894, 241898, 241902, 245898, 245902). The geographic extent of this dataset is the same as that of the MassGIS dataset Boston, Massachusetts Region 1:5,000 Color Ortho Imagery (1/2-meter Resolution), 2001, and it was used to produce the MassGIS dataset Boston, Massachusetts, 2-Dimensional Building Footprints with Roof Height Data (from LIDAR data), 2002 [see cross references].
Abstract:
This dataset consists of 2D footprints of the buildings in the metropolitan Boston area, based on tiles in the orthoimage index (orthophoto quad IDs: 229890, 229894, 229898, 229902, 233886, 233890, 233894, 233898, 233902, 237890, 237894, 237898, 237902, 241890, 241894, 241898, 241902, 245898, 245902). This data set was collected using 3Di's Digital Airborne Topographic Imaging System II (DATIS II). Roof height and footprint elevation attributes (derived from 1-meter resolution LIDAR (LIght Detection And Ranging) data) are included as part of each building feature. This data can be combined with other datasets to create 3D representations of buildings and the surrounding environment.
Abstract:
The macroeconomic results achieved by Belarus in 2012 laid bare the weakness and inefficiency of its economy. Belarus's GDP and positive trade balance were growing in the first half of last year. However, this trend reversed when Russia blocked the scheme of extremely lucrative manipulations in the re-export of Russian petroleum products by Belarus, and when demand for potassium fertilisers fell on the global market. It became clear once again that the outdated Belarusian model of a centrally planned economy is unable to generate sustainable growth, and that the Belarusian economy needs thorough structural reforms. Nevertheless, President Alyaksandr Lukashenka consistently continues to block any changes in the system, while at the same time expecting the economic indicators this year to reach levels far beyond the possibilities of the Belarusian economy. There is therefore a risk that the Belarusian government may employ, as it has done before, instruments aimed at artificially stimulating domestic demand, including money creation. This may upset the relative stability of state finances which the regime managed to achieve last year. The worst-case scenario would be a repeat of 2011, when a serious financial crisis forced Minsk to make concessions (including selling the national network of gas pipelines) to Moscow, its only real source of loans. It thus cannot be ruled out that this time, too, the only way to recover from the slump will be additional loan support and energy subsidies from Russia, at the expense of selling further strategic companies to Russian investors.
Abstract:
Today's society makes massive use of portable electronic devices. This fact, combined with market competitiveness, has driven the development of these devices towards better power management and, consequently, greater autonomy and efficiency. In the power management of an SoC, it is the voltage regulators that play a role of utmost importance. The work carried out in this dissertation comprises the design of an LDO-type linear voltage regulator in HV-CMOS technology, capable of withstanding input voltages of 12 V, intended to supply RF-CMOS functional blocks with 3.3 V at a current of 100 mA. It was implemented in the 0.35 µm, 50 V CMOS process from Austria Micro Systems. The regulator's quiescent current, which determines the current efficiency, is 120.22 µA. It achieves a current efficiency of 99.88% and an overall efficiency of 82.46% when the minimum input voltage is applied. The regulator has a dropout voltage of 707 mV. System stability is maintained even for load transitions from 10 µA to 100 mA. The regulator has a settling time below 2.4 µs and a worst-case output voltage deviation from its nominal value below 18 mV. However, it exhibits an undershoot and overshoot of ±1.85 V.
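The efficiency figures quoted above can be checked with a few lines of arithmetic. A minimal sketch (Python), assuming the minimum input voltage is the 3.3 V output plus the 707 mV dropout:

```python
# Reproducing the efficiency figures quoted above (values from the abstract).
I_LOAD = 100e-3      # maximum load current: 100 mA
I_Q = 120.22e-6      # quiescent current: 120.22 uA
V_OUT = 3.3          # regulated output voltage: 3.3 V
V_DROPOUT = 0.707    # dropout voltage: 707 mV

# Minimum input voltage at which regulation is maintained.
v_in_min = V_OUT + V_DROPOUT                                # ~4.007 V

# Current efficiency: fraction of the input current delivered to the load.
eta_current = I_LOAD / (I_LOAD + I_Q)                       # ~0.9988 -> 99.88 %

# Overall (power) efficiency at the minimum input voltage.
eta_power = (V_OUT * I_LOAD) / (v_in_min * (I_LOAD + I_Q))  # ~0.823

print(f"current efficiency: {eta_current:.2%}")
print(f"power efficiency:   {eta_power:.2%}")
```

This reproduces the 99.88% current efficiency; the power efficiency comes out near 82.3%, close to the quoted 82.46% (the small gap presumably reflects the exact operating point used in the dissertation).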
Abstract:
The triggering mechanism and the temporal evolution of large flood events, especially of worst-case scenarios, are not yet fully understood. Consequently, the cumulative losses of extreme floods are unknown. To study the link between weather conditions, discharges and flood losses, it is necessary to couple atmospheric, hydrological, hydrodynamic and damage models. The objective of the M-AARE project is to test the potential and opportunities of a model chain that relates atmospheric conditions to flood losses or risks. The M-AARE model chain is a set of coupled models consisting of four main components: the precipitation module, the hydrology module, the hydrodynamic module, and the damage module. The models are coupled in a cascading framework with harmonized time steps. First exploratory applications show that one-way coupling of the WRF-PREVAH-BASEMENT models has been achieved and provides promising new insights towards a better understanding of key aspects of flood risk analysis.
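The cascading, one-way coupling described above can be sketched in a few lines. The stage functions below are hypothetical stand-ins for the WRF (precipitation), PREVAH (hydrology), BASEMENT (hydrodynamics) and damage modules, invented purely for illustration:

```python
# Minimal sketch of a one-way cascading model chain with harmonized time steps.
# All four stage functions and their numbers are invented stand-ins.

def precipitation(t):            # mm/h at hourly time step t (stand-in for WRF)
    return max(0.0, 10.0 - abs(t - 12))

def hydrology(rain):             # discharge in m^3/s from rainfall (stand-in for PREVAH)
    return 50.0 + 8.0 * rain

def hydrodynamics(discharge):    # water depth in m at a site (stand-in for BASEMENT)
    return max(0.0, (discharge - 80.0) / 40.0)

def damage(depth):               # loss from a simple stage-damage curve
    return 0.0 if depth <= 0 else 1e5 * depth ** 1.5

# Harmonized hourly time steps: each module consumes the previous one's output.
for t in range(24):
    loss = damage(hydrodynamics(hydrology(precipitation(t))))
    print(f"hour {t:2d}: loss = {loss:10.0f}")
```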
Abstract:
Partial information leakage in deterministic public-key cryptosystems refers to a problem that arises when information about either the plaintext or the key is leaked in subtle ways. A common case is where there is only a small number of possible messages that may be sent: an attacker may be able to crack the scheme simply by enumerating all the possible ciphertexts. Two methods are proposed for addressing the partial information leakage problem in RSA; both incorporate a random element into the encrypted message to increase the number of possible ciphertexts. The resulting scheme is, effectively, an RSA-like cryptosystem which exhibits probabilistic encryption. The first method involves encrypting several similar messages with RSA and then using the Quadratic Residuosity Problem (QRP) to mark the intended one. In this way, an adversary who has correctly guessed two or more of the ciphertexts is still in doubt about which message is the intended one. The cryptographic strength of the combined system is equal to the computational difficulty of factorising a large integer; ideally, this should be infeasible. The second scheme uses error-correcting codes to accommodate the random component. The plaintext is processed with an error-correcting code and deliberately corrupted before encryption. The introduced corruption lies within the error-correcting ability of the code, so the original message can still be recovered. The random corruption yields a vast number of possible ciphertexts corresponding to a given plaintext; hence an attacker cannot deduce any useful information from it. The proposed systems are compared to other cryptosystems sharing similar characteristics, in terms of execution time and ciphertext size, to determine their practical utility. Finally, the parameters which determine the characteristics of the proposed schemes are also examined.
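A toy example illustrates both the enumeration attack and the effect of a random component. This minimal sketch (Python) uses an unrealistically small modulus and a generic random-padding scheme; it is not either of the two constructions proposed in the work:

```python
import secrets

# Toy RSA parameters (far too small for real use; illustration only).
p, q = 61, 53
n = p * q                          # modulus, 3233
e = 17                             # public exponent
d = pow(e, -1, (p - 1) * (q - 1))  # private exponent

def encrypt_deterministic(m):
    return pow(m, e, n)

def encrypt_randomized(m, pad_bits=4):
    # Append a random nonce so the same plaintext maps to many ciphertexts.
    r = secrets.randbits(pad_bits)
    return pow((m << pad_bits) | r, e, n)

def decrypt_randomized(c, pad_bits=4):
    return pow(c, d, n) >> pad_bits   # strip the random padding

# With deterministic RSA, an attacker who knows the message space
# {65, 66} (say, "A"/"B") just enumerates both ciphertexts:
known = {encrypt_deterministic(m): m for m in (65, 66)}
assert known[encrypt_deterministic(66)] == 66   # the attack succeeds

# With a random component, repeated encryptions of the same message differ,
# so a single table lookup no longer identifies the plaintext.
c1, c2 = encrypt_randomized(66), encrypt_randomized(66)
print(c1, c2, decrypt_randomized(c1))
```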
Abstract:
The need to improve the management of language learning organizations, in the light of the trend toward mass higher education and the use of English as a world language, was the starting point of this thesis. The thesis aims to assess the relevance, adequacy and relative success of Total Quality Management (TQM) as a management philosophy. Taking as its empirical evidence a TQM-oriented management project in a Turkish higher education context, the thesis observes the consequences of a change of organizational culture, with specific reference to teachers' attitudes towards management. Both qualitative and quantitative devices are employed to plot change, and the value of these devices for identifying such change is considered. The main focus of the thesis is the soft S's of an organization (Shared Values, Style, Staff and Skills) rather than the hard S's (System, Structure, Strategy). The thesis is not concerned with the teaching and learning processes, though the PDCA cycle (the Action Research Cycle) did play a part in the project for both the teachers and the researcher involved in this study of organizational development. Both before the management project was launched and at the end of the research period, external measurement devices (Harrison's Culture Specification Device and Hofstede's VSM) were used to describe the culture of the Centre. During the management project, internal measurement devices were used to record the change, including change in middle-management style (that of the researcher in this case). The time period chosen for this study was between September 1991 and June 1994. During this period, each device was administered twice, at intervals ranging from a year to 32 months.
Abstract:
Models at runtime can be defined as abstract representations of a system, including its structure and behaviour, which exist in tandem with the given system during the actual execution time of that system. Furthermore, these models should be causally connected to the system being modelled, offering a reflective capability. Significant advances have been made in recent years in applying this concept, most notably in adaptive systems. In this paper we argue that a similar approach can also be used to support the dynamic generation of software artefacts at execution time. An important area where this is relevant is the generation of software mediators to tackle the crucial problem of interoperability in distributed systems. We refer to this approach as emergent middleware, representing a fundamentally new approach to resolving interoperability problems in the complex distributed systems of today. In this context, the runtime models are used to capture meta-information about the underlying networked systems that need to interoperate, including their interfaces and additional knowledge about their associated behaviour. This is supplemented by ontological information to enable semantic reasoning. This paper focuses on this novel use of models at runtime, examining in detail the nature of such runtime models together with the supporting algorithms and tools that extract this knowledge and use it to synthesise the appropriate emergent middleware.
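The notion of a model that is causally connected to a running system can be made concrete with a small sketch (Python); the classes below are illustrative inventions, not the paper's tooling:

```python
# Minimal sketch of a causally connected model at runtime: reading the model
# reflects the system's current state, and changing the model adapts the system.

class NetworkedSystem:
    """Stand-in for a running networked system exposing some operations."""
    def __init__(self):
        self.operations = {"getTemperature": "GET /temp"}

class RuntimeModel:
    """Abstract representation kept in tandem with the running system."""
    def __init__(self, system):
        self._system = system

    # Reflection: queries on the model are answered from the live system.
    @property
    def interface(self):
        return dict(self._system.operations)

    # Causal connection: edits to the model take effect in the system.
    def rename_operation(self, old, new):
        self._system.operations[new] = self._system.operations.pop(old)

system = NetworkedSystem()
model = RuntimeModel(system)
model.rename_operation("getTemperature", "readTemperature")
assert "readTemperature" in system.operations   # the system changed too
```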
Abstract:
Polarization-diverse optical phase conjugation of a 1 THz spectral-band 1.14 Tb/s DP-QPSK WDM multiplex is demonstrated for the first time, showing a worst-case Q^2 penalty of 0.9 dB over all conjugate wavelengths, polarizations and OSNR. © 2014 OSA.
Abstract:
We present a study of the influence of dispersion-induced phase noise in CO-OFDM systems using software-based FFT multiplexing/IFFT demultiplexing techniques. The software-based system provides a method for rigorous evaluation of the phase noise variance caused by the Common Phase Error (CPE) and Inter-Carrier Interference (ICI), including, for the first time to our knowledge, the effect of equalization enhanced phase noise (EEPN) in explicit form. This, in turn, leads to an analytic BER specification. Numerical results focus on a CO-OFDM system with 10-25 GS/s QPSK channel modulation. A worst-case constellation configuration is identified for the phase noise influence, and the resulting BER is compared to the BER of a conventional single-channel QPSK system with the same capacity as the CO-OFDM implementation. Results are evaluated as a function of transmission distance. For both types of systems, the phase noise variance increases significantly with increasing transmission distance. For a total capacity of 400 (1000) Gbit/s, the transmission distance at which BER < 10^-2 for the worst-case CO-OFDM design is less than 800 (460) km, whereas for a single-channel QPSK system it is less than 1400 (560) km.
Abstract:
In this talk we investigate the use of spectrally shaped amplified spontaneous emission (ASE) to emulate highly dispersed wavelength division multiplexed (WDM) signals in an optical transmission system. Such a technique offers various simplifications to large-scale WDM experiments. Not only does it reduce transmitter complexity by removing the need for multiple source lasers, it also potentially reduces test and measurement complexity by requiring only the centre channel of a WDM system to be measured in order to estimate WDM worst-case performance. The use of ASE as a test and measurement tool is well established in optical communication systems, and several measurement techniques will be discussed [1, 2]. One of the most prevalent uses of ASE is in the measurement of receiver sensitivity, where ASE is introduced to degrade the optical signal-to-noise ratio (OSNR) and the resulting bit error rate (BER) is measured at the receiver. From an analytical point of view, noise has been used to emulate system performance: the Gaussian Noise model is used as an estimate of highly dispersed signals and has attracted considerable interest [3]. The work presented here extends the use of ASE by using it to emulate highly dispersed WDM signals, thereby reducing WDM transmitter complexity and receiver measurement time in a lab environment. Results thus far have indicated [2] that such a transmitter configuration is consistent with an AWGN model for transmission, with modulation format complexity and nonlinearities playing a key role in estimating the performance of systems using the ASE channel emulation technique. We conclude this work by investigating techniques capable of characterising the nonlinear and damage limits of optical fibres and the resultant information capacity limits.
References:
1. McCarthy, M. E., N. Mac Suibhne, S. T. Le, P. Harper, and A. D. Ellis, "High spectral efficiency transmission emulation for non-linear transmission performance estimation for high order modulation formats," 2014 European Conference on Optical Communication (ECOC), IEEE, 2014.
2. Ellis, A., N. Mac Suibhne, F. Gunning, and S. Sygletos, "Expressions for the nonlinear transmission performance of multi-mode optical fiber," Opt. Express, Vol. 21, 22834–22846, 2013.
3. Vacondio, F., O. Rival, C. Simonneau, E. Grellier, A. Bononi, L. Lorcy, J. Antona, and S. Bigo, "On nonlinear distortions of highly dispersive optical coherent systems," Opt. Express, Vol. 20, 1022–1032, 2012.
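Since the ASE-loaded transmitter is reported to behave consistently with an AWGN model, the corresponding back-of-envelope step can be sketched directly. The snippet below (Python) uses the textbook QPSK-over-AWGN bit error rate relation; it is a generic illustration, not the talk's measurement procedure:

```python
import math

def qpsk_ber_awgn(ebn0_db):
    """Textbook QPSK bit error rate over AWGN: BER = Q(sqrt(2*Eb/N0))."""
    ebn0 = 10 ** (ebn0_db / 10)
    return 0.5 * math.erfc(math.sqrt(ebn0))

# Sweep Eb/N0 as one would when loading ASE noise to degrade the OSNR.
for ebn0_db in range(4, 13, 2):
    print(f"Eb/N0 = {ebn0_db:2d} dB -> BER = {qpsk_ber_awgn(ebn0_db):.2e}")
```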
Abstract:
This dissertation presents and evaluates a methodology for scheduling medical application workloads in virtualized computing environments. Such environments are being widely adopted by providers of "cloud computing" services. In the context of provisioning resources for medical applications, they allow users to deploy applications on distributed computing resources while keeping their data secure. Furthermore, higher-level services that further abstract infrastructure-related issues can be built on top of such infrastructures. For example, a medical imaging service can allow medical professionals to process their data in the cloud, relieving them of the burden of deploying and managing these resources themselves. In this work, we focus on issues related to scheduling scientific workloads in virtualized environments. We build upon the knowledge base of traditional parallel job scheduling to address the specific case of medical applications while harnessing the benefits afforded by virtualization technology. To this end, we provide the following contributions: (1) an in-depth analysis of the execution characteristics of the target applications when run in virtualized environments; (2) a performance prediction methodology applicable to the target environment; (3) a scheduling algorithm that harnesses application knowledge and virtualization-related benefits to provide strong scheduling performance and quality-of-service guarantees. In the process of addressing these issues for our target user base (i.e. medical professionals and researchers), we provide insight that benefits a large community of scientific application users in industry and academia. Our execution time prediction and scheduling methodologies are implemented and evaluated on a real system running popular scientific applications. We find that we are able to predict the execution time of a number of these applications with an average error of 15%. Our scheduling methodology, tested with medical image processing workloads, is compared to two baseline scheduling solutions and outperforms them in terms of both the number of jobs processed and resource utilization by 20–30%, without violating any deadlines. We conclude that our solution is a viable approach to supporting the computational needs of medical users, even if the cloud computing paradigm is not widely adopted in its current form.
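The quoted 15% average prediction error is the kind of figure produced by a mean absolute percentage error over a set of runs; a minimal sketch (Python) with invented sample values:

```python
# Mean absolute percentage error between predicted and measured execution times.
# The sample run times below are invented; only the metric matches the text.

predicted = [118.0, 260.0, 75.0, 410.0]   # predicted execution times (s)
actual    = [130.0, 241.0, 88.0, 375.0]   # measured execution times (s)

errors = [abs(p - a) / a for p, a in zip(predicted, actual)]
mape = 100.0 * sum(errors) / len(errors)
print(f"average prediction error: {mape:.1f}%")
```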
Abstract:
Males and the 1-to-5-year age group show a much higher risk of childhood acute lymphoblastic leukemia (ALL). We performed a case-only genome-wide association study (GWAS), using the Illumina Infinium HumanCoreExome Chip, to unmask gender- and age-specific risk variants in 240 non-Hispanic white children with ALL recruited at Texas Children's Cancer Center, Houston, Texas. Besides the statistically most significant results, we also considered the results that yielded the highest effect sizes. Existing experimental data and bioinformatic predictions were used to complement the results and to examine the biological significance of the statistical findings. Our study identified novel risk variants for childhood ALL. The SNP rs4813720 (RASSF2) showed the statistically most significant gender-specific association (P < 2 × 10^-6). Likewise, rs10505918 (SOX5) yielded the lowest P value (P < 1 × 10^-5) for age-specific associations and also showed the statistically most significant association with age at onset (P < 1 × 10^-4). Two SNPs, rs12722042 and rs12722039, from the HLA-DQA1 region yielded the highest effect sizes (odds ratio (OR) = 15.7; P = 0.002) among the gender-specific results, and the SNP rs17109582 (OR = 12.5; P = 0.006) showed the highest effect size among the age-specific results. Sex chromosome variants did not appear to be involved in the gender-specific associations. The HLA-DQA1 SNPs belong to DQA1*01:07 and confirm the previously reported male-specific association with DQA1*01:07. Twenty-one of the SNPs identified as risk markers for gender- or age-specific associations were located in transcription factor binding sites, and 56 SNPs were non-synonymous variants likely to alter protein function. Although the bioinformatic analysis did not implicate a particular mechanism for the gender- and age-specific associations, RASSF2 has an estrogen receptor-alpha binding site in its promoter. That the mechanisms remain unknown may reflect the limited attention gender- and age-specificity have received in association studies. These results provide a foundation for further studies examining the gender and age differentials in childhood ALL risk. Following replication and mechanistic studies, risk factors for one gender or age group have the potential to be used as biomarkers for targeted intervention, for prevention and possibly also for treatment.
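An odds ratio such as the quoted OR = 15.7 comes from a 2×2 table of carrier counts. A minimal sketch (Python, using scipy.stats.fisher_exact) with invented counts chosen so the OR matches the quoted value; the p-value depends on the true counts and will not match exactly:

```python
# Odds ratio and Fisher's exact p-value from a 2x2 table of carrier counts.
# The counts below are invented for illustration; they are not the study's data.
from scipy.stats import fisher_exact

#                 carriers  non-carriers
table = [[14, 106],    # at-risk group (e.g. males)
         [ 1, 119]]    # comparison group (e.g. females)

odds_ratio, p_value = fisher_exact(table)
print(f"OR = {odds_ratio:.1f}, P = {p_value:.3f}")   # OR = (14*119)/(106*1) ~ 15.7
```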
Abstract:
Amyotrophic Lateral Sclerosis (ALS) is a neurodegenerative disease characterized by progressive muscle weakness that leads the patient to death, usually due to respiratory complications. As the disease progresses, the patient requires noninvasive ventilation (NIV) and constant monitoring. This paper presents a distributed architecture for homecare monitoring of nocturnal NIV in patients with ALS. The implementation of this architecture used single-board computers and mobile devices placed in patients' homes, to display alert messages for caregivers, and a web server for remote monitoring by the healthcare staff. The architecture used software based on fuzzy logic and computer vision to capture data from a mechanical ventilator screen and generate alert messages with instructions for caregivers. The monitoring was performed on 29 patients for 7 continuous hours daily during 5 days, generating a total of 126,000 samples for each monitored variable at a sampling rate of one sample per second. The system was evaluated regarding the hit rate of character recognition and its correction through an algorithm for error detection and correction. Furthermore, a healthcare team evaluated the system regarding the time intervals at which the alert messages were generated and the correctness of those messages. The system showed an average hit rate of 98.72%, and 98.39% in the worst case. As for the messages generated, the system agreed 100% with the overall assessment, with disagreement in only 2 cases with one of the physician evaluators.
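A fuzzy-logic alerting step of the kind described above can be sketched in a few lines (Python); the monitored variable, membership breakpoints and alert threshold are invented for illustration and are not the paper's clinical rules:

```python
# Minimal sketch of fuzzy-style alerting on a value read off a ventilator screen.
# The variable, breakpoints and threshold are hypothetical illustrations only.

def membership_low_tidal_volume(ml):
    """Degree (0..1) to which a tidal volume reading is 'too low'."""
    if ml <= 250:
        return 1.0
    if ml >= 400:
        return 0.0
    return (400 - ml) / 150          # linear ramp between the breakpoints

def alert_message(ml, threshold=0.6):
    """Generate a caregiver instruction when the membership degree is high."""
    if membership_low_tidal_volume(ml) >= threshold:
        return f"ALERT: low tidal volume ({ml} ml), check mask seal"
    return None

for reading in (480, 330, 240):     # sample values read off the screen
    print(reading, "->", alert_message(reading))
```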