879 results for Computation time delay


Relevance:

80.00%

Publisher:

Abstract:

The constant evolution of technology has made available computational tools that were mere expectations ten years ago. The increase in computational power applied to numerical models that simulate the atmosphere has broadened the study of atmospheric phenomena through the use of high-performance computing tools. This work proposed the development of algorithms based on SIMT architectures, applying parallelism techniques with the OpenACC toolset to process numerical forecast data from the Weather Research and Forecasting model. The proposal has a strong interdisciplinary character, seeking interaction between the fields of atmospheric modelling and scientific computing. The influence of the cloud microphysics computation on the model's run-time degradation was tested. Because the input data for execution on the GPU was not large enough, the time required to transfer data from the CPU to the GPU exceeded the time needed to run the computation on the CPU. Another determining factor was the addition of CUDA code within an MPI context, which caused resource contention among the processors and again degraded execution time. Using directives to apply high-performance computing in a CUDA framework looks very promising, but it must still be used with great caution in order to produce good results. A hybrid MPI + CUDA build was tested, but the results were inconclusive.

Relevance:

80.00%

Publisher:

Abstract:

This document presents GEmSysC, a unified cryptographic API for embedded systems. Software layers implementing this API can be built over existing libraries, allowing embedded software to access cryptographic functions in a consistent way that does not depend on the underlying library. The API complies with good practices for API design and for embedded software development, and took its inspiration from other cryptographic libraries and standards. The main inspiration for creating GEmSysC was the CMSIS-RTOS standard, which defines a unified API for embedded software in an implementation-independent way, but targets operating systems instead of cryptographic functions. GEmSysC is made of a generic core and attachable modules, one for each cryptographic algorithm. This document contains the specification of the core of GEmSysC and three of its modules: AES, RSA and SHA-256. GEmSysC was built targeting embedded systems, but this does not restrict its use to such systems – after all, embedded systems are just very limited computing devices. As a proof of concept, two implementations of GEmSysC were made. One of them was built over wolfSSL, which is an open-source library for embedded systems. The other was built over OpenSSL, which is open source and a de facto standard. Unlike wolfSSL, OpenSSL does not specifically target embedded systems. The implementation built over wolfSSL was evaluated on a Cortex-M3 processor with no operating system, while the implementation built over OpenSSL was evaluated on a personal computer with the Windows 10 operating system. Test results show GEmSysC to be simpler than other libraries in some aspects, and that both implementations incur little computation-time overhead compared to the underlying cryptographic libraries themselves.
The overhead of each implementation was measured for each cryptographic algorithm and lies between around 0% and 0.17% for the implementation over wolfSSL, and between 0.03% and 1.40% for the one over OpenSSL. This document also presents the memory costs of each implementation.

Relevance:

80.00%

Publisher:

Abstract:

Policy and decision makers dealing with environmental conservation and land-use planning often need to identify potential sites that can contribute to minimizing the sediment flow reaching riverbeds. This is the case for reforestation initiatives, which can have sediment-flow minimization among their objectives. This paper proposes an Integer Programming (IP) formulation and a Heuristic solution method for selecting a predefined number of locations to be reforested so as to minimize the sediment load at a given outlet of a watershed. Although the core structure of both methods can be applied to different sorts of flow, the formulations are targeted at the minimization of sediment delivery. The proposed approaches use a Single Flow Direction (SFD) raster map covering the watershed to construct a tree structure in which the outlet cell corresponds to the root node. The results obtained with both approaches are in agreement with expert assessments of erosion levels, slopes and distances to the riverbeds, which allows us to conclude that the approach is suitable for minimizing sediment flow. Since the results obtained with the IP formulation are the same as those obtained with the Heuristic approach, an optimality proof is included in the present work. Considering that the heuristic requires much less computation time, it is the more suitable solution method for large problems.
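The tree construction above follows from the SFD map: each cell drains to exactly one neighbour, so the watershed forms a tree rooted at the outlet cell. The sketch below illustrates only the greedy idea with an invented drainage map, loads, and a per-hop delivery ratio; it is not the paper's IP formulation or its actual heuristic.

```python
# Minimal sketch, assuming sediment from a cell decays by a fixed delivery
# ratio per downstream hop; reforest the k cells whose delivered load is
# largest. The toy drainage map and loads are illustrative, not paper data.

def hops_to_outlet(cell, downstream):
    """Number of downstream steps from `cell` to the outlet (tree root)."""
    n = 0
    while downstream[cell] is not None:
        cell = downstream[cell]
        n += 1
    return n

def delivered(cell, downstream, load, ratio=0.9):
    """Sediment load from `cell` that survives the trip to the outlet."""
    return load[cell] * ratio ** hops_to_outlet(cell, downstream)

def greedy_select(cells, downstream, load, k, ratio=0.9):
    """Pick the k cells whose delivered sediment load is largest."""
    return sorted(cells, key=lambda c: delivered(c, downstream, load, ratio),
                  reverse=True)[:k]

downstream = {"a": "b", "b": "out", "c": "out", "out": None}
load = {"a": 10.0, "b": 1.0, "c": 5.0, "out": 0.0}
print(greedy_select(["a", "b", "c"], downstream, load, k=2))  # ['a', 'c']
```

Cell "a" wins despite two hops of attenuation (10.0 × 0.9² = 8.1 delivered) because its raw load dominates; a cell close to the outlet with a smaller load, like "c", can still outrank a distant heavy one.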

Relevance:

80.00%

Publisher:

Abstract:

The benefits obtained from mating are usually condition-dependent, favouring the evolution of flexible investment during copulation, for instance in terms of invested time, energy, or sperm. Flexible investment strategies are predicted to depend on the likelihood of acquiring alternative mates, and therefore on the timing of mate encounter. However, experimental evidence for this hypothesis is scarce. Here we manipulated the time delay until first mating and the interval between first and second mating in the polygynandrous common lizard, Zootoca vivipara. We determined treatment effects on fertilisation success and copulation duration, the latter being a proxy for investment in mating and for the quantity of transferred sperm. The duration of the second copulation decreased with increasing inter-mating interval and depended on the fertilisation success of first mates. The former provides evidence for time-dependent investment strategies, most likely resulting from the progression of the female's reproductive cycle. Fertilisation success of first mates increased with increasing inter-mating interval and was higher when females were closer to ovulation, showing that flexible investment strategies significantly affected male reproductive success. This points to fertilisation assurance, which may mitigate negative effects of low population density on reproductive success, e.g. Allee effects.

Relevance:

80.00%

Publisher:

Abstract:

A computer vision system that has to interact in natural language needs to understand the visual appearance of interactions between objects, along with the appearance of the objects themselves. Relationships between objects are frequently mentioned in queries for tasks like semantic image retrieval, image captioning, visual question answering and natural-language object detection. Hence, it is essential to model context between objects when solving these tasks. In the first part of this thesis, we present a technique for detecting an object mentioned in a natural-language query. Specifically, we work with referring expressions, which are sentences that identify a particular object instance in an image. In many referring expressions, an object is described in relation to another object using prepositions, comparative adjectives, action verbs, etc. Our proposed technique can identify both the referred object and the context object mentioned in such expressions. Context is also useful for incrementally understanding scenes and videos. In the second part of this thesis, we propose techniques for searching for objects in an image and events in a video. Our proposed incremental algorithms use the context from previously explored regions to prioritize the regions to explore next. The advantage of incremental understanding is restricting the amount of computation time and/or resources spent on various detection tasks. Our first proposed technique shows how to learn context in indoor scenes in an implicit manner and use it for searching for objects. The second shows how explicitly written context rules of one-on-one basketball can be used to sequentially detect events in a game.
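The context-driven prioritization described above can be illustrated with a small priority-queue sketch: regions are explored in order of a prior score, and whenever the (stubbed) detector fires, neighbouring regions get a score boost so they are explored sooner, within a fixed exploration budget. The region representation, scores, and boost value are illustrative assumptions, not the thesis's learned context models.

```python
import heapq

# Hedged sketch of incremental, context-aware region search. Regions and
# their "scores" stand in for real image regions and a real detector.

def incremental_search(regions, neighbours, budget):
    """Explore up to `budget` regions, boosting neighbours of detections."""
    heap = [(-r["prior"], rid) for rid, r in regions.items()]
    heapq.heapify(heap)
    bonus = {rid: 0.0 for rid in regions}    # extra priority from context
    visited, hits = set(), []
    while heap and len(visited) < budget:
        _, rid = heapq.heappop(heap)
        if rid in visited:
            continue                          # skip stale heap entries
        visited.add(rid)
        if regions[rid]["score"] > 0.5:       # stub detector fires here
            hits.append(rid)
            for nb in neighbours.get(rid, []):  # context boosts neighbours
                if nb not in visited:
                    bonus[nb] += 0.5
                    heapq.heappush(heap, (-(regions[nb]["prior"] + bonus[nb]), nb))
    return hits, len(visited)

regions = {"A": {"prior": 0.9, "score": 0.9},
           "B": {"prior": 0.1, "score": 0.8},
           "C": {"prior": 0.5, "score": 0.0}}
neighbours = {"A": ["B"]}
hits, explored = incremental_search(regions, neighbours, budget=2)
print(hits)  # ['A', 'B']: the hit on A promotes B past the higher-prior C
```

With a budget of two, finding an object in region A lifts its low-prior neighbour B above region C, so both objects are found without exhausting the search space, which is exactly the saving incremental understanding aims for.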

Relevance:

80.00%

Publisher:

Abstract:

Invasive candidiasis (IC) is an opportunistic systemic mycosis caused by Candida species (commonly Candida albicans) that continues to pose a significant public health problem worldwide. Despite great advances in antifungal therapy and changes in clinical practice, IC remains a major infectious cause of morbidity and mortality in severely immunocompromised or critically ill patients, and further accounts for substantial healthcare costs. Its impact on patient clinical outcome and economic burden could be ameliorated by timely initiation of appropriate antifungal therapy. However, early detection of IC is extremely difficult because of its unspecific clinical signs and symptoms, and the inadequate accuracy and time delay of the currently available diagnostic or risk-stratification methods. Consequently, the diagnosis of IC is often attained in advanced stages of infection (leading to delayed therapeutic interventions and ensuing poor clinical outcomes) or, unfortunately, at autopsy. In addition to the difficulties encountered in diagnosing IC at an early stage, the initial therapeutic decision-making process is also hindered by the insufficient accuracy of the currently available tools for predicting clinical outcomes in individual IC patients at presentation. It is therefore not surprising that clinicians are generally unable to detect IC early, or to identify those IC patients who are most likely to suffer fatal clinical outcomes and may benefit from more personalized therapeutic strategies at presentation. Better diagnostic and prognostic biomarkers for IC are thus needed to improve the clinical management of this life-threatening and costly opportunistic fungal infection...

Relevance:

80.00%

Publisher:

Abstract:

This thesis addresses the analysis of the shape of 2D objects. In computer vision there are numerous aspects from which information can be extracted. One of the most widely used is the shape, or contour, of objects. With suitable processing, this visual characteristic lets us extract information about objects, analyse scenes, and so on. However, an object's contour or silhouette contains redundant information. This excess of data, which contributes no new knowledge, should be removed in order to speed up subsequent processing or to minimise the size of the contour's representation for storage or transmission. This data reduction must be performed without losing information that is important for representing the original contour. A reduced version of a contour can be obtained by removing intermediate points and joining the remaining points with segments. This reduced representation of a contour is known as a polygonal approximation. Polygonal approximations of contours therefore represent a compressed version of the original information. Their main use is to reduce the volume of information needed to represent an object's contour. Nevertheless, in recent years these approximations have also been used for object recognition, with polygonal approximation algorithms applied directly to extract the feature vectors employed in the learning phase. The contributions of this thesis therefore centre on several aspects of polygonal approximations. In the first contribution, several polygonal approximation algorithms were improved by means of a preprocessing stage that accelerates them, even improving the quality of the solutions in less time.
In the second contribution, a new polygonal approximation algorithm was proposed that obtains optimal solutions in less time than the other methods in the literature. In the third contribution, an approximation algorithm was proposed that is able to obtain the optimal solution in few iterations in most cases. Finally, an improved version of the optimal polygonal approximation algorithm was proposed, solving an alternative optimisation problem.
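As a concrete example of what a polygonal approximation computes, the classic Ramer-Douglas-Peucker heuristic below removes contour points that deviate less than a tolerance from the chord between the endpoints. It is shown only to illustrate the problem; the thesis proposes different (optimal and near-optimal) algorithms, not this heuristic.

```python
# Classic Ramer-Douglas-Peucker contour simplification, for illustration.

def point_line_dist(p, a, b):
    """Distance from point p to the line through a and b."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    if dx == dy == 0:
        return ((px - ax) ** 2 + (py - ay) ** 2) ** 0.5
    return abs(dy * (px - ax) - dx * (py - ay)) / (dx * dx + dy * dy) ** 0.5

def rdp(points, eps):
    """Keep only vertices that deviate more than eps from the chord."""
    if len(points) < 3:
        return list(points)
    a, b = points[0], points[-1]
    i, dmax = 0, 0.0
    for k in range(1, len(points) - 1):
        d = point_line_dist(points[k], a, b)
        if d > dmax:
            i, dmax = k, d
    if dmax <= eps:
        return [a, b]                    # all interior points are redundant
    left = rdp(points[: i + 1], eps)     # recurse around the farthest vertex
    right = rdp(points[i:], eps)
    return left[:-1] + right

pts = [(0, 0), (1, 0.01), (2, 0), (3, 2), (4, 0)]
print(rdp(pts, 0.1))  # the near-collinear point (1, 0.01) is dropped
```

The trade-off the thesis studies is visible even here: a larger tolerance gives a smaller (more compressed) polygon at the cost of contour fidelity, and heuristics like this one give no optimality guarantee on that trade-off.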

Relevance:

80.00%

Publisher:

Abstract:

Doctorate in Economics

Relevance:

80.00%

Publisher:

Abstract:

This thesis details the design and applications of a terahertz (THz) frequency comb spectrometer. The spectrometer employs two offset-locked Ti:Sapphire femtosecond oscillators with repetition rates of approximately 80 MHz, offset locked at 100 Hz to continuously sample a time delay of 12.5 ns at a maximum time-delay resolution of 15.6 fs. These oscillators emit continuous pulse trains, allowing the generation of a THz pulse train by the master, or pump, oscillator and the sampling of this THz pulse train by the slave, or probe, oscillator via the electro-optic effect. Collecting a train of 16 consecutive THz pulses and taking the Fourier transform of this pulse train produces a decade-spanning frequency comb, from 0.25 to 2.5 THz, with a comb tooth width of 5 MHz and a comb tooth spacing of ~80 MHz. This frequency comb is suitable for Doppler-limited rotational spectroscopy of small molecules. Here, the data from 68 individual scans at slightly different pump oscillator repetition rates were combined, producing an interleaved THz frequency comb spectrum with a maximum interval between comb teeth of 1.4 MHz, enabling THz frequency comb spectroscopy.
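The sampling figures above follow from standard equivalent-time sampling with two offset-locked oscillators: the accessible delay window is one repetition period, and the delay step between successive pump-probe pulse pairs is the offset frequency divided by the square of the repetition rate. A quick check with the approximate values quoted above:

```python
# Sanity check of the quoted sampling figures, assuming ideal
# equivalent-time sampling with the nominal 80 MHz / 100 Hz values.
f_rep = 80e6      # pump/probe repetition rate, Hz (approximate)
f_off = 100.0     # offset lock between the two oscillators, Hz

window = 1.0 / f_rep        # delay window swept each beat period
step = f_off / f_rep ** 2   # effective delay step per pulse pair

print(window)  # 1.25e-08 s  -> the 12.5 ns delay window
print(step)    # 1.5625e-14 s -> the ~15.6 fs delay resolution
```

The comb tooth spacing of ~80 MHz is likewise just the repetition rate itself, which is why interleaving scans at slightly different repetition rates fills in the gaps between teeth.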

The accuracy of the THz frequency comb spectrometer was tested, achieving a root mean square error of 92 kHz measuring selected absorption center frequencies of water vapor at 10 mTorr, and a root mean square error of 150 kHz in measurements of a K-stack of acetonitrile. This accuracy is sufficient for fitting of measured transitions to a model Hamiltonian to generate a predicted spectrum for molecules of interest in the fields of astronomy and physical chemistry. As such, the rotational spectra of methanol and methanol-OD were acquired by the spectrometer. Absorptions from 1.3 THz to 2.0 THz were compared to JPL catalog data for methanol and the spectrometer achieved an RMS error of 402 kHz, improving to 303 kHz when excluding low signal-to-noise absorptions. This level of accuracy compares favorably with the ~100 kHz accuracy achieved by JPL frequency multiplier submillimeter spectrometers. Additionally, the relative intensity performance of the THz frequency comb spectrometer is linear across the entire decade-spanning bandwidth, making it the preferred instrument for recovering lineshapes and taking absolute intensity measurements in the THz region. The data acquired by the Terahertz Frequency Comb Spectrometer for methanol-OD is of comparable accuracy to the methanol data and may be used to refine the fit parameters for the predicted spectrum of methanol-OD.

Relevance:

80.00%

Publisher:

Abstract:

Virtually every sector of business and industry that uses computing, including financial analysis, search engines, and electronic commerce, incorporates Big Data analysis into its business model. Sophisticated clustering algorithms are popular for deducing the nature of data by assigning labels to unlabeled data. We address two main challenges in Big Data. First, by definition, the volume of Big Data is too large to be loaded into a computer's memory (this volume changes based on the computer used or available, but there is always a data set that is too large for any computer). Second, in real-time applications, the velocity of new incoming data prevents historical data from being stored and future data from being accessed. Therefore, we propose our Streaming Kernel Fuzzy c-Means (stKFCM) algorithm, which significantly reduces both computational complexity and space complexity. The proposed stKFCM requires only O(n²) memory, where n is the (predetermined) size of a data subset (or data chunk) at each time step, which makes the algorithm truly scalable (as n can be chosen based on the available memory). Furthermore, only 2n² elements of the full N × N (where N >> n) kernel matrix need to be calculated at each time step, reducing both the time to produce the kernel elements and the complexity of the FCM algorithm. Empirical results show that stKFCM, even with relatively small n, can provide clustering performance as accurate as kernel fuzzy c-means run on the entire data set, while achieving a significant speedup.
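The 2n² figure can be made concrete with a small sketch: at each time step only the kernel block of the new chunk against itself, plus its block against n retained summary points, is ever formed, never the full N × N matrix. The RBF kernel and chunk handling below are illustrative assumptions, not the stKFCM algorithm itself.

```python
import numpy as np

# Illustration of the per-step memory argument: two n x n kernel blocks
# (2 * n**2 entries) per chunk, regardless of the full stream length N.

def rbf_kernel(A, B, gamma=0.5):
    """Gaussian (RBF) kernel matrix between row-vector sets A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kernel_blocks_for_step(chunk, retained, gamma=0.5):
    """Only the blocks a streaming step needs: chunk-chunk and chunk-retained."""
    K_new = rbf_kernel(chunk, chunk, gamma)       # n x n
    K_cross = rbf_kernel(chunk, retained, gamma)  # n x n
    return K_new, K_cross

rng = np.random.default_rng(0)
n, d = 4, 2
chunk = rng.normal(size=(n, d))      # the data arriving at this time step
retained = rng.normal(size=(n, d))   # summary points kept from the past
K_new, K_cross = kernel_blocks_for_step(chunk, retained)
print(K_new.shape, K_cross.shape)    # (4, 4) (4, 4): 2*n**2 entries total
```

The memberships and cluster summaries would then be updated from these two blocks alone, which is what keeps both the memory footprint and the kernel-evaluation cost independent of N.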

Relevance:

80.00%

Publisher:

Abstract:

In recent years, there has been enormous growth in location-aware devices, such as GPS-embedded cell phones, mobile sensors and radio-frequency identification tags. The age of combining sensing, processing and communication in one device gives rise to a vast number of applications, leading to endless possibilities and a realization of mobile Wireless Sensor Network (mWSN) applications. As computing, sensing and communication become more ubiquitous, trajectory privacy becomes a critical piece of information and an important factor for commercial success. While on the move, sensor nodes continuously transmit data streams of sensed values and spatiotemporal information, known as "trajectory information". If adversaries can intercept this information, they can monitor the trajectory path and capture the location of the source node. This research stems from the recognition that the wide applicability of mWSNs will remain elusive unless a trajectory privacy preservation mechanism is developed. The outcome seeks to lay a firm foundation in the field of trajectory privacy preservation in mWSNs against external and internal trajectory privacy attacks. First, to prevent external attacks, we investigated a context-based trajectory privacy-aware routing protocol to prevent eavesdropping attacks. Traditional shortest-path-oriented routing algorithms give adversaries the possibility of locating the target node within a certain area. We designed a novel privacy-aware routing phase and utilized the trajectory dissimilarity between mobile nodes to mislead adversaries about the location where a message started its journey. Second, to detect internal attacks, we developed a software-based attestation solution to detect compromised nodes. We created a dynamic attestation node chain among neighboring nodes to examine the memory checksum of suspicious nodes, improving the computation time for memory traversal compared to previous work.
Finally, we revisited the trust issue in trajectory privacy preservation mechanism designs. We used Bayesian game theory to model and analyze the behaviors of cooperative, selfish and malicious nodes in trajectory privacy preservation activities.
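The software-based attestation idea can be sketched as follows: a verifier issues a nonce, the suspected node computes a checksum over its memory in a nonce-derived pseudo-random order, and any tampered byte changes the result. The traversal scheme, hash choice and sizes below are illustrative assumptions, not the dynamic attestation node chain developed in this work.

```python
import hashlib
import random

# Toy sketch of software-based attestation via a nonce-seeded memory walk.
# Firmware contents and sizes are invented for illustration.

def attest(memory: bytes, nonce: int) -> str:
    """Checksum of `memory` traversed in a nonce-derived pseudo-random order."""
    rnd = random.Random(nonce)            # verifier-chosen, unpredictable order
    order = list(range(len(memory)))
    rnd.shuffle(order)
    h = hashlib.sha256(nonce.to_bytes(8, "big"))
    for i in order:                       # pseudo-random memory traversal
        h.update(memory[i:i + 1])
    return h.hexdigest()

firmware = bytes(range(64))               # the "clean" memory image
tampered = bytes([255]) + firmware[1:]    # one modified byte

assert attest(firmware, 42) == attest(firmware, 42)   # deterministic per nonce
assert attest(firmware, 42) != attest(tampered, 42)   # tampering is detected
```

Because the traversal order depends on a fresh nonce each round, a compromised node cannot precompute or replay checksums; the time it takes to answer is also part of real attestation protocols, which is where the memory-traversal speedup mentioned above matters.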

Relevance:

80.00%

Publisher:

Abstract:

This thesis investigates the rotational behavior of abstracted small-wind-turbine rotors exposed to a sudden increase in oncoming flow velocity, i.e. a gust. These rotors consisted of blades with aspect ratios characteristic of samara seeds, which are known for their ability to maintain autorotation in unsteady wind. The models were tested in a towing tank using a custom-built experimental rig. The setup was designed and constructed to allow measurement of the instantaneous angular velocity of a rotor model towed at a prescribed kinematic profile along the tank. The conclusions presented in this thesis are based on the observed trends in effective angle-of-attack distribution, tip speed ratio, angular velocity, and time delay in the rotational response for each of the rotors over the prescribed gust cases. It was found that the blades with the higher aspect ratio had higher tip speed ratios and responded faster than the blades with the lower aspect ratio. The decrease in instantaneous tip speed ratio during the onset of a prescribed gust correlated with the time delay in each rotor model's rotational response. The time delays were found to increase nonlinearly with decreasing durations over which the simulated gusts occurred.
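For reference, the tip speed ratio discussed above is λ = ωR/U, the blade-tip speed divided by the oncoming (here, towing) flow speed. The numbers below are illustrative assumptions, not measurements from these experiments.

```python
import math

# Illustrative tip-speed-ratio computation; all values are assumed.
omega = 2 * math.pi * 10.0   # angular velocity, rad/s (10 rev/s, assumed)
R = 0.05                     # rotor radius, m (assumed)
U = 0.5                      # towing / oncoming flow speed, m/s (assumed)

tip_speed_ratio = omega * R / U
print(round(tip_speed_ratio, 3))  # 6.283
```

During a gust, U jumps while ω lags behind, so λ drops instantaneously and recovers as the rotor spins up; that dip is what the thesis correlates with the rotational time delay.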

Relevance:

80.00%

Publisher:

Abstract:

The main goal of this paper is to present and validate a methodology for designing efficient automatic controllers for irrigation canals, based on the Saint-Venant model. This model-based methodology makes it possible to design controllers at the design stage (before the canal is built). The methodology is applied to an experimental canal located in Portugal. First, the full nonlinear PDE model is calibrated using a single steady-state experiment. The model is then linearized around an operating point in order to design linear PI controllers. Two classical control strategies are tested (local upstream control and distant downstream control) and compared on the canal. The experimental results show the effectiveness of the model.
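A discrete PI law of the kind designed from the linearized model can be sketched as below: the gate command is driven by the water-level error and its integral. The gains, time step and the crude first-order pool stand-in are illustrative assumptions, not the calibrated Saint-Venant model or the tuned controllers from the paper.

```python
# Minimal sketch of a discrete PI water-level controller closed around a
# toy canal pool. All gains and dynamics are assumed for illustration.

def make_pi(kp, ki, dt):
    """Return a PI controller function u = kp*e + ki * integral(e)."""
    state = {"integral": 0.0}
    def pi(error):
        state["integral"] += error * dt
        return kp * error + ki * state["integral"]
    return pi

pi = make_pi(kp=0.5, ki=0.01, dt=60.0)   # one control update per minute
level, setpoint = 0.0, 1.0               # water level and target, m
for _ in range(200):
    u = pi(setpoint - level)             # gate-opening command
    # crude stand-in pool: inflow grows with u, outflow with level
    level += 60.0 * (0.002 * u - 0.001 * level)
print(round(level, 2))  # settles close to the 1.0 m setpoint
```

The integral term is what removes the steady-state offset here: at equilibrium the proportional term vanishes, and the accumulated integral alone holds the gate at the opening that balances the pool's outflow.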

Relevance:

80.00%

Publisher:

Abstract:

A High-Performance Computing (HPC) job dispatcher is critical software that assigns the finite computing resources to submitted jobs. This resource assignment over time is known as the on-line job dispatching problem in HPC systems. Because the problem is on-line, solutions must be computed in real time, and the time they require cannot exceed a certain threshold without affecting normal system functioning. In addition, a job dispatcher must deal with a lot of uncertainty: submission times, the number of requested resources, and the durations of jobs. Heuristic-based techniques have been used broadly in HPC systems, producing (possibly sub-optimal) solutions in a short time. However, their scheduling and resource-allocation components are separated, which generates decoupled decisions that may cause a performance loss. Optimization-based techniques are less used for this problem, although they can significantly improve the performance of HPC systems at the expense of higher computation time. Nowadays, HPC systems are being used for modern applications, such as big data analytics and predictive model building, which in general employ many short jobs. However, this information is unknown at dispatching time, and job dispatchers need to process large numbers of such jobs quickly while ensuring high Quality-of-Service (QoS) levels. Constraint Programming (CP) has been shown to be an effective approach to tackling job dispatching problems. However, state-of-the-art CP-based job dispatchers are unable to meet the challenges of on-line dispatching, such as generating dispatching decisions in a brief period and integrating current and past information about the hosting system.
For these reasons, we propose CP-based dispatchers that are better suited to HPC systems running modern applications: they generate on-line dispatching decisions in due time and make effective use of job-duration predictions to improve QoS levels, especially for workloads dominated by short jobs.
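For contrast with the CP-based dispatchers discussed above, the heuristic baseline long used in HPC systems can be sketched as a simple first-come-first-served allocator: each job starts at the earliest moment enough nodes are free, with no look-ahead or backfilling. Job fields and the cluster size below are illustrative, not the thesis's workloads.

```python
import heapq

# Hedged sketch of a heuristic FCFS dispatcher: jobs are dispatched strictly
# in arrival order onto the first instant enough nodes are released.

def fcfs_dispatch(jobs, total_nodes):
    """jobs: list of (submit_time, nodes, duration), in arrival order.
    Returns the start time assigned to each job."""
    running = []            # min-heap of (finish_time, nodes_to_release)
    free = total_nodes
    starts = []
    clock = 0
    for submit, nodes, duration in jobs:
        clock = max(clock, submit)
        while free < nodes:            # wait for running jobs to release nodes
            finish, released = heapq.heappop(running)
            clock = max(clock, finish)
            free += released
        starts.append(clock)
        free -= nodes
        heapq.heappush(running, (clock + duration, nodes))
    return starts

# 6-node cluster: the third job must wait for the first to finish at t=10.
jobs = [(0, 4, 10), (0, 2, 5), (1, 4, 3)]
print(fcfs_dispatch(jobs, 6))  # [0, 0, 10]
```

The example shows the weakness that motivates smarter dispatchers: the short third job idles nine time units behind a long one even though enough nodes free up at t=5, exactly the kind of QoS loss on short-job workloads that duration-aware CP models aim to avoid.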

Relevance:

80.00%

Publisher:

Abstract:

T2Well-ECO2M is a coupled wellbore-reservoir simulator, still under development at Lawrence Berkeley National Laboratory (USA), that handles H2O-CO2-NaCl mixtures and simulates CO2 phase transitions and multiphase flow. The code was originally developed for simulating CO2 injection into deep saline aquifers and modelling enhanced geothermal systems; the focus of this research, however, was to modify and test T2Well-ECO2M for simulating CO2 injection into depleted gas reservoirs. To this end, the original code was modified in a few parts, and a dedicated injection case was developed to study the CO2 phase transition inside a wellbore and the corresponding thermal effects. In the first scenario, the injection case was run using the fully numerical approach to the wellbore-to-formation heat-exchange calculation. Results were analysed in terms of wellbore pressure and temperature vertical profiles, wellhead and bottomhole conditions, and characteristic reservoir displacement fronts. Special attention was given to a thorough analysis of bottomhole temperature, the critical parameter for hydrate formation. Besides the expected direct effect of wellbore temperature changes on reservoir conditions, the simulation results also indicated an effect of the CO2 phase change in the near-wellbore zone on bottomhole pressure distribution. To test the implemented software changes, in a second scenario the same injection case was reproduced using the improved semi-analytical time-convolution approach to the wellbore-to-formation heat-exchange calculation. The comparison of the two scenarios showed that the simulated wellbore and reservoir parameters after one year of continuous CO2 injection are in good agreement, while the computation time of the semi-analytical time-convolution approach is reduced.
The updated T2Well-ECO2M version has proved to be a robust, high-performing wellbore-reservoir simulator that can also be used to simulate CO2 injection into depleted gas reservoirs.