908 results for "Loading constraint"


Relevance: 20.00%

Abstract:

This article reports the results of a survey of the pearl oyster industry in French Polynesia. Its purpose is to examine perceptions of the priorities for steering this industry towards sustainable development. These perceptions were captured through a survey of pearl oyster farmers and of the other stakeholders in the sector (management authorities, scientists). After describing the methodological protocol of the survey, we first compare the sustainable-development priorities chosen by the professionals (i.e. pearl farmers) with the perceptions of the other stakeholders in the sector. Secondly, we build a typology of the pearl farmers' priorities concerning sustainable development. This analysis enables an assessment of the degree of convergence within the sector, which is the base material for defining a shared action plan at the territorial scale. This is the first study compiling survey data from such a range of professionals and stakeholders of the pearl farming industry over so large an area of French Polynesia.

Abstract:

Context. In February-March 2014, the MAGIC telescopes observed the high-frequency peaked BL Lac 1ES 1011+496 (z=0.212) in a flaring state at very high energies (VHE, E>100 GeV). The flux reached a level more than 10 times higher than in any previously recorded flaring state of the source. Aims. We describe the characteristics of the flare, presenting the light curve and the spectral parameters of the night-wise spectra and of the average spectrum of the whole period. From these data we aim to detect the imprint of the Extragalactic Background Light (EBL) in the VHE spectrum of the source, in order to constrain its intensity in the optical band. Methods. We analyzed the gamma-ray data from the MAGIC telescopes using the standard MAGIC software to produce the light curve and the spectra. To constrain the EBL we implemented the method developed by the H.E.S.S. collaboration, in which the intrinsic energy spectrum of the source is modeled with a simple function (< 4 parameters) and the EBL-induced optical depth is calculated using a template EBL model. The likelihood of the observed spectrum is then maximized, including a normalization factor for the EBL opacity among the free parameters. Results. The collected data allowed us to describe the flux changes night by night and to produce differential energy spectra for all nights of the observed period. The estimated intrinsic spectra of all the nights could be fitted by power-law functions. Evaluating the changes in the fit parameters, we conclude that the spectral shapes of most of the nights were compatible regardless of the flux level, which enabled us to produce an average spectrum from which the EBL imprint could be constrained. The likelihood ratio test shows that a model with an EBL density 1.07 (-0.20, +0.24)stat+sys times the one in the tested EBL template (Domínguez et al. 2011) is preferred over the no-EBL hypothesis at the 4.6 σ level, under the assumption that the intrinsic source spectrum can be modeled as a log-parabola. This translates into a constraint on the EBL density in the wavelength range [0.24 μm, 4.25 μm], with a peak value at 1.4 μm of λF_λ = 12.27^(+2.75)_(-2.29) nW m^(-2) sr^(-1), including systematics.
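The likelihood maximization described above can be sketched numerically. The toy example below uses an invented linear opacity in place of the Domínguez et al. (2011) template and a simple weighted least-squares fit in place of the full MAGIC likelihood; only the overall structure (intrinsic log-parabola times a scaled EBL attenuation, with the scaling α free) mirrors the method.

```python
import numpy as np
from scipy.optimize import curve_fit

# Toy energy grid in TeV
E = np.logspace(-1, 0.5, 12)

def tau_template(E):
    # Invented stand-in for a template EBL optical depth (NOT the
    # Dominguez et al. 2011 model): opacity growing linearly with energy.
    return 1.5 * E

def model(E, f0, a, b, alpha):
    # Intrinsic log-parabola attenuated by alpha times the template opacity
    intrinsic = f0 * (E / 0.3) ** (-a - b * np.log(E / 0.3))
    return intrinsic * np.exp(-alpha * tau_template(E))

# Simulate an observed spectrum whose true EBL normalization is 1.0
rng = np.random.default_rng(0)
truth = model(E, 1e-10, 2.0, 0.5, 1.0)
obs = truth * rng.normal(1.0, 0.05, size=E.size)
err = 0.05 * truth

# Fit the spectral parameters and the EBL scaling alpha simultaneously
popt, _ = curve_fit(model, E, obs, p0=[1e-10, 2.0, 0.5, 0.5], sigma=err)
print(f"best-fit EBL scaling: alpha = {popt[3]:.2f}")
```

The fitted α plays the role of the EBL normalization factor quoted in the abstract (1.07 relative to the template).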

Abstract:

The Bacillus subtilis DnaI, DnaB and DnaD proteins load the replicative ring helicase DnaC onto DNA during priming of DNA replication. Here we show that DnaI consists of a C-terminal domain (Cd) with ATPase and DNA-binding activities and an N-terminal domain (Nd) that interacts with the replicative ring helicase. A Zn2+-binding module mediates the interaction with the helicase and C67, C70 and H84 are involved in the coordination of the Zn2+. DnaI binds ATP and exhibits ATPase activity that is not stimulated by ssDNA, because the DNA-binding site on Cd is masked by Nd. The ATPase activity resides on the Cd domain and when detached from the Nd domain, it becomes sensitive to stimulation by ssDNA because its cryptic DNA-binding site is exposed. Therefore, Nd acts as a molecular 'switch' regulating access to the ssDNA binding site on Cd, in response to binding of the helicase. DnaI is sufficient to load the replicative helicase from a complex with six DnaI molecules, so there is no requirement for a dual helicase loader system.

Abstract:

Undergraduate project submitted to Universidade Fernando Pessoa in partial fulfilment of the requirements for the degree of Licenciada (Bachelor) in Physiotherapy.

Abstract:

Declarative techniques such as Constraint Programming can be very effective in modeling and assisting management decisions. We present a method for managing university classrooms which extends a previous design of a Constraint-Informed Information System, generating the timetables while dealing with spatial resource optimization issues. We seek to maximize space utilization along two dimensions: classroom use and occupancy rates. While we want to maximize the room use rate, we still need to satisfy the soft constraints which model students' and lecturers' preferences. We present a constraint logic programming-based local search method which relies on an evaluation function combining room utilization with the soft timetable preferences. Based on this, we developed a tool and applied it to improving classroom allocation at a university. Compared with the current timetables, obtained without optimizing space utilization, the initial version of our tool manages to reach a 30% improvement in space utilization while preserving the quality of the timetable for both students and lecturers.
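The combined evaluation function can be illustrated on a toy instance. The lectures, rooms, the single lecturer preference and its weight below are all invented, and exhaustive enumeration stands in for the constraint logic programming-based local search, which is only feasible on an instance this small.

```python
import itertools

# Toy data: lectures with class sizes, rooms with capacities (invented)
lectures = {"L1": 30, "L2": 80, "L3": 25}
rooms = {"R_small": 35, "R_mid": 60, "R_big": 100}

def occupancy(assign):
    # Mean seat-occupancy rate over the assigned rooms (0..1)
    return sum(lectures[l] / rooms[r] for l, r in assign.items()) / len(assign)

def preference_penalty(assign):
    # Soft constraint: the lecturer of L2 prefers R_big (assumed preference)
    return 0.0 if assign["L2"] == "R_big" else 0.2

def evaluate(assign):
    # Combined objective: maximize occupancy minus soft-preference penalties
    return occupancy(assign) - preference_penalty(assign)

def feasible(assign):
    # Hard constraints: rooms are big enough, one lecture per room
    return all(lectures[l] <= rooms[r] for l, r in assign.items()) and \
           len(set(assign.values())) == len(assign)

best = max((dict(zip(lectures, p)) for p in itertools.permutations(rooms)
            if feasible(dict(zip(lectures, p)))), key=evaluate)
print(best, round(evaluate(best), 3))
```

The small class goes to the small room and the large class to the preferred large room, which maximizes the combined score.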

Abstract:

Solving a complex Constraint Satisfaction Problem (CSP) is a computationally hard task which may require a considerable amount of time. Parallelism has been applied successfully to this task, and there are already many applications capable of harnessing the parallel power of modern CPUs to speed up the solving process. Current Graphics Processing Units (GPUs), containing from a few hundred to a few thousand cores, possess a level of parallelism that surpasses that of CPUs, yet far fewer applications are capable of solving CSPs on GPUs, leaving space for further improvement. This paper describes work in progress on solving CSPs on GPUs, CPUs and other devices, such as Intel Many Integrated Cores (MICs), in parallel. It presents the gains obtained when applying more devices to some problems, and the main challenges that must be faced when using devices with architectures as different as those of CPUs and GPUs, with a particular focus on how to achieve good load balancing between such heterogeneous devices.
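One simple load-balancing scheme for heterogeneous devices is self-scheduling: the search space is split into many small blocks and each device pulls the next block when it becomes idle, so faster devices naturally take a larger share. The sketch below simulates this with threads and invented per-block costs; it is an illustration of the balancing idea, not the paper's actual implementation.

```python
from queue import Queue, Empty
from threading import Thread
import time

# Split the search space into many small blocks held in a shared queue
blocks = Queue()
for b in range(40):
    blocks.put(b)

# Hypothetical relative speeds (seconds per block) for a CPU and a GPU
devices = {"CPU": 0.002, "GPU": 0.0005}
done = {name: [] for name in devices}

def worker(name, cost):
    # Each device repeatedly claims the next unexplored block until none remain
    while True:
        try:
            b = blocks.get_nowait()
        except Empty:
            return
        time.sleep(cost)        # stand-in for exploring one block of the CSP
        done[name].append(b)

threads = [Thread(target=worker, args=(n, c)) for n, c in devices.items()]
for t in threads:
    t.start()
for t in threads:
    t.join()

print({n: len(bs) for n, bs in done.items()})
```

Because blocks are claimed dynamically rather than split statically, the 4x-faster "GPU" ends up processing roughly four times as many blocks, and neither device sits idle at the end.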

Abstract:

The application of Computational Fluid Dynamics based on the Reynolds-Averaged Navier-Stokes equations to the simulation of bluff body aerodynamics has been thoroughly investigated in the past. Although satisfactory accuracy can be obtained for some urban physics problems, the predictive capability of such models is limited to the mean flow properties, while the ability to accurately predict turbulent fluctuations is recognized to be of fundamental importance when dealing with wind loading and pollution dispersion problems. The need to correctly take the flow dynamics into account when such problems are faced has led researchers to move towards scale-resolving turbulence models such as Large Eddy Simulation (LES). The development and assessment of LES as a tool for the analysis of these problems is nowadays an active research field and represents a demanding engineering challenge. This research work has two objectives. The first is focused on wind load assessment and aims to study the capability of LES to reproduce wind load effects in terms of internal forces on structural members. This differs from the majority of the existing research, where the performance of LES is evaluated only in terms of surface pressures, and is done with a view to adopting LES as a complementary design tool alongside wind tunnel tests. The second objective is the study of the capability of LES to calculate pollutant dispersion in the built environment. The validation of LES in this field is considered to be of the utmost importance in order to conceive healthier and more sustainable cities. In order to validate the numerical setup adopted, a systematic comparison between numerical and experimental data is performed. The results obtained are intended to be used in the drafting of best practice guidelines for the application of LES in the urban physics field, with particular attention to wind load assessment and pollution dispersion problems.
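When comparing numerical and experimental dispersion data, a widely used validation metric is FAC2, the fraction of predictions within a factor of two of the observations. A minimal sketch (the concentration values are illustrative, not from this work):

```python
import numpy as np

def fac2(pred, obs):
    # Fraction of predictions within a factor of two of the observations,
    # a common acceptance metric in dispersion-model validation.
    ratio = pred / obs
    return float(np.mean((ratio >= 0.5) & (ratio <= 2.0)))

# Illustrative measured and simulated concentrations at four receptors
obs = np.array([1.0, 2.0, 4.0, 8.0])
pred = np.array([1.2, 1.0, 3.5, 20.0])
print(fac2(pred, obs))   # three of four predictions fall within a factor of 2
```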

Abstract:

A High-Performance Computing (HPC) job dispatcher is a critical piece of software that assigns the finite computing resources to submitted jobs. This resource assignment over time is known as the on-line job dispatching problem in HPC systems. Because the problem is on-line, solutions must be computed in real time, and the time they require cannot exceed a threshold beyond which the normal functioning of the system is affected. In addition, a job dispatcher must deal with considerable uncertainty: submission times, the number of requested resources, and the duration of jobs. Heuristic-based techniques have been used broadly in HPC systems, obtaining (sub-)optimal solutions in a short time. However, their scheduling and resource allocation components are separated, which yields decoupled decisions that may cause a performance loss. Optimization-based techniques are used less often for this problem, although they can significantly improve the performance of HPC systems at the expense of a higher computation time. Nowadays, HPC systems are being used for modern applications, such as big data analytics and predictive model building, which in general employ many short jobs. This information is unknown at dispatching time, however, and job dispatchers need to process large numbers of such jobs quickly while ensuring high Quality-of-Service (QoS) levels. Constraint Programming (CP) has been shown to be an effective approach to job dispatching problems. However, state-of-the-art CP-based job dispatchers are unable to meet the challenges of on-line dispatching, such as generating dispatching decisions within a brief period and integrating current and past information about the host system.
For these reasons, we propose CP-based dispatchers that are more suitable for HPC systems running modern applications: they generate on-line dispatching decisions within an appropriate time and are able to make effective use of job duration predictions to improve QoS levels, especially for workloads dominated by short jobs.
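The role of duration predictions can be illustrated with a small sketch. This is not the thesis's CP model: a greedy shortest-predicted-duration-first rule stands in for it, and the jobs, core counts and predictions below are invented. The point is that favouring predicted-short jobs lets them start immediately instead of waiting behind a long job.

```python
import heapq

# Toy jobs: (id, requested_cores, predicted_duration_seconds)
jobs = [("j1", 4, 100.0), ("j2", 2, 5.0), ("j3", 2, 5.0), ("j4", 4, 50.0)]
TOTAL_CORES = 8

def dispatch(jobs, cores):
    # Greedy stand-in for the CP model: start predicted-short jobs first,
    # so that short-job-dominated workloads see low waiting times.
    t, free, schedule, events = 0.0, cores, {}, []
    pending = sorted(jobs, key=lambda j: j[2])   # shortest predicted first
    while pending or events:
        started = True
        while started:                           # start everything that fits now
            started = False
            for j in list(pending):
                if j[1] <= free:
                    free -= j[1]
                    schedule[j[0]] = t           # record the start time
                    heapq.heappush(events, (t + j[2], j[1]))
                    pending.remove(j)
                    started = True
        if events:                               # advance to the next job end
            t, released = heapq.heappop(events)
            free += released
    return schedule

s = dispatch(jobs, TOTAL_CORES)
print(s)
```

Here the two short jobs and the mid-length job all start at t=0 on the 8 cores, and the long job j1 starts as soon as the short jobs release their cores at t=5.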

Abstract:

Using Computational Wind Engineering (CWE) to solve wind-related problems is still a challenging task today, mainly because of the high computational cost required to obtain trustworthy simulations. In particular, Large Eddy Simulation (LES) has been widely used for evaluating wind loads on buildings. The present thesis assesses the capability of LES as a design tool for wind loading predictions through three cases. The first case uses LES to simulate the wind field around a ground-mounted rectangular prism in Atmospheric Boundary Layer (ABL) flow. The numerical results are validated against experimental results for seven wind attack angles, giving a global understanding of the model performance. The case with the worst model behaviour is investigated further, including the spatial distribution of the pressure coefficients and their discrepancies with respect to the experimental results. The effects of several numerical parameters are investigated for this case to understand how effectively they modify the numerical results. The second case uses LES to investigate the wind effects on a real high-rise building, aiming to validate the performance of LES as a design tool in practical applications. The numerical results are validated against the experimental results in terms of the distribution of the pressure statistics and the global forces. The mesh sensitivity and the computational cost are discussed. The third case uses LES to study the wind effects on the new large-span roof over the Bologna stadium. The dynamic responses are analyzed and design envelopes for the structure are obtained. Although this simulation precedes the traditional wind tunnel tests, i.e. the numerical results are not validated, the preliminary evaluations can effectively inform later investigations and give the final design process greater confidence in the absence of potentially unexpected behaviours.
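The pressure-statistics validation mentioned above rests on pressure coefficients. The sketch below applies the standard definition Cp = (p − p_ref) / (0.5 ρ U_ref²) to a synthetic pressure signal at one tap; all numerical values are illustrative, not from the thesis.

```python
import numpy as np

def pressure_coefficient(p, p_ref, rho, U_ref):
    # Standard definition: Cp = (p - p_ref) / (0.5 * rho * U_ref**2)
    return (p - p_ref) / (0.5 * rho * U_ref ** 2)

# Synthetic pressure time series at one tap (illustrative values):
# a mean suction of -30 Pa plus Gaussian turbulent fluctuations
rng = np.random.default_rng(1)
rho, U_ref, p_ref = 1.225, 10.0, 0.0        # air density, reference wind speed
p = -30.0 + 12.0 * rng.standard_normal(5000)

cp = pressure_coefficient(p, p_ref, rho, U_ref)
stats = {"mean": cp.mean(), "rms": cp.std(), "min_peak": cp.min()}
print({k: round(v, 3) for k, v in stats.items()})
```

Mean, rms and peak Cp are exactly the quantities compared tap-by-tap between LES and wind tunnel data; the peak value is the one most sensitive to how well the simulation resolves turbulent fluctuations.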

Abstract:

The thesis explores recent technological developments in the field of structural health monitoring and their application to railway bridge projects. It focuses on two main topics. First, service loads and the effects of environmental actions are modelled. In particular, the train moving load and its interaction with the rail track are considered with different degrees of detail, and the results are compared with real-time experimental measurements. Secondly, the work concerns the identification, definition and modelling of damage scenarios for a prestressed concrete railway bridge and their implementation in FEM models. Along with a critical interpretation of the in-field measurements, this approach results in the development of undamaged and damaged databases for the AI-aided detection of anomalies and in the definition of threshold levels to prompt automatic alert interventions. In conclusion, an innovative solution for the development of a railway weigh-in-motion system is proposed.
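The threshold-level idea can be sketched as follows. The monitored feature (a natural frequency), the numbers and the mean ± 3σ rule are assumptions for illustration; they stand in for the thesis's undamaged database and its alert logic.

```python
import numpy as np

# Baseline "undamaged" measurements of a monitored feature, e.g. a natural
# frequency identified from sensor data (synthetic values, for illustration)
rng = np.random.default_rng(2)
baseline = 3.80 + 0.02 * rng.standard_normal(200)   # Hz

# Threshold levels derived from the undamaged database (mean +/- 3 sigma)
mu, sigma = baseline.mean(), baseline.std()
lo, hi = mu - 3 * sigma, mu + 3 * sigma

def alert(value):
    # Prompt an automatic alert when a new measurement leaves the healthy band
    return not (lo <= value <= hi)

print(alert(3.81), alert(3.60))   # in-band reading vs. stiffness-loss reading
```

A reading near the baseline stays inside the band, while a marked frequency drop (as a stiffness loss would cause) triggers the alert.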

Abstract:

Vision systems are powerful tools that play an increasingly important role in modern industry, detecting errors and maintaining product standards. With the growing availability of affordable industrial cameras, computer vision algorithms have been applied increasingly to the monitoring of industrial manufacturing processes. Until a few years ago, industrial computer vision applications relied only on ad-hoc algorithms designed for the specific object and acquisition setup being monitored, with a strong focus on co-designing the acquisition and processing pipeline. Deep learning has overcome these limits, providing greater flexibility and faster re-configuration. In this work, the process to be inspected is the formation of packs of vials entering a freeze-dryer, a common scenario in pharmaceutical active ingredient packaging lines. To ensure that the machine produces proper packs, a vision system is installed at the entrance of the freeze-dryer to detect possible anomalies, with execution times compatible with the production specifications. Further constraints come from the sterility and safety standards required in pharmaceutical manufacturing. This work presents an overview of the production line, with particular focus on the vision system designed, and of all the trials conducted to obtain the final performance. Transfer learning, which alleviates the need for large amounts of training data, combined with data augmentation methods based on the generation of synthetic images, was used to effectively increase performance while reducing the cost of data acquisition and annotation. The proposed vision algorithm is composed of two main subtasks, designed for vial counting and for discrepancy detection, respectively. The first was trained on more than 23k vials (about 300 images) and tested on 5k more (about 75 images), whereas 60 training images and 52 testing images were used for the second.
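The data-augmentation idea can be sketched with a few generic image transformations. The specific augmentations and parameters below are hypothetical choices for illustration, not necessarily those used in the work; the point is that each source image yields many plausible synthetic variants.

```python
import numpy as np

def augment(img, rng):
    # Simple augmentations used to enlarge a small industrial dataset:
    # random horizontal flip, small horizontal shift, brightness jitter
    # (hypothetical choices; the thesis pipeline may differ)
    out = img[:, ::-1] if rng.random() < 0.5 else img
    out = np.roll(out, int(rng.integers(-2, 3)), axis=1)
    gain = 1.0 + 0.1 * (rng.random() - 0.5)
    return np.clip(out * gain, 0.0, 1.0)

rng = np.random.default_rng(3)
image = rng.random((32, 32))                  # stand-in for a vial-pack image
batch = np.stack([augment(image, rng) for _ in range(8)])
print(batch.shape)                            # 8 synthetic variants of 1 image
```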

Abstract:

Background. The scientific literature has recently shown that proper postural control facilitates upper-limb movements. There is evidence that applying restraints to the patient's trunk improves upper-limb function. Objectives. The main objective of this thesis was to verify how supporting the trunk with a stable axial structure, through an external support known as a "trunk constraint", increases postural control and thereby facilitates fractionated upper-limb movements in people with sequelae of neurological disease. Materials and methods. The clinical case concerns a 60-year-old man with left hemiparesis resulting from a right ischemic stroke. A protocol of ten one-hour treatment sessions was carried out, in which facilitation through the trunk constraint was applied in different rehabilitation settings. Data were collected with the Trunk Control Test, the Trunk Impairment Scale and the Fugl-Meyer Assessment. In addition, an observational video analysis of a functional upper-limb gesture was performed. Results. The data collected show positive effects with respect to the initial hypotheses. Improvements were found in the items of the administered scales and in the qualitative assessment of the upper limb. In particular, trunk control improved on the Trunk Control Test and the Trunk Impairment Scale, and upper-limb function improved on the Fugl-Meyer Assessment. The observational video analysis shows an improvement in activation timing during the reaching phase. Conclusions. The results obtained support the idea that increasing the antigravity activity of the trunk, including through external supports such as the trunk constraint, can facilitate a functional improvement of the upper limb.

Abstract:

Combinatorial decision and optimization problems arise in numerous applications, such as logistics and scheduling, and can be solved with various approaches. Boolean Satisfiability and Constraint Programming solvers are among the most used, and their performance is significantly influenced by the model chosen to represent a given problem. This has led to the study of model reformulation methods, one of which is tabulation: rewriting the expression of a constraint in terms of a table constraint. To apply it, one should identify which constraints can help and which can hinder the solving process. So far this has been done by hand, for example in MiniZinc, or automatically with manually designed heuristics, as in Savile Row. However, the performance of these heuristics has been shown to differ across problems and solvers, in some cases helping and in others hindering the solving procedure. Meanwhile, recent work in combinatorial optimization has shown that Machine Learning (ML) can be increasingly useful in model reformulation. This thesis aims to design a ML approach to identify the instances for which Savile Row's heuristics should be activated. Additionally, since the heuristics may miss some good tabulation opportunities, we perform an exploratory analysis towards a ML classifier able to predict whether or not a constraint should be tabulated. The results for the first goal show that a random forest classifier leads to an increase in the performance of 4 different solvers. The experimental results for the second task show that a ML approach could improve the performance of a solver for some problem classes.
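A minimal sketch of the first goal: train a random forest on per-instance features and check its held-out accuracy. The features and labels below are synthetic (an invented, learnable rule); only the overall shape of the pipeline reflects the thesis.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Synthetic instance features (imagine: number of constraints, mean arity,
# domain size, suitably normalized) and labels saying whether activating
# the tabulation heuristics paid off -- invented data, for illustration
rng = np.random.default_rng(4)
X = rng.random((300, 3))
y = (X[:, 0] + 0.5 * X[:, 1] > 0.9).astype(int)   # a learnable toy rule

# Train on 200 instances, evaluate on the remaining 100
clf = RandomForestClassifier(n_estimators=50, random_state=0)
clf.fit(X[:200], y[:200])
acc = clf.score(X[200:], y[200:])
print(f"held-out accuracy: {acc:.2f}")
```

In the real setting the prediction would gate whether Savile Row's tabulation heuristics are switched on for a given instance, so only the cheap feature extraction is paid on every solve.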

Abstract:

Electrochemical hydrogen loading is a technique used to produce and study the hydrogenation of metals starting from a liquid solution containing water. It is a possible alternative to another, well-established technique which loads hydrogen starting from its gas phase. In this work, the electrochemical method is used to understand the fundamental thermodynamics of hydrogen loading in constrained systems such as thin films on substrates, and possibly to distinguish the roles of interfaces, stresses and microstructure during the hydrogenation process. The systems under study are thin films of Pd, Mg/Pd, and Ti/Mg multilayers. Possible future technological applications lie in the fields of hydrogen storage and hydrogen sensing. In the final part of the work, the experimental setup is modified by introducing an automatic relay; this change improves the data analysis and the attainable information on the kinetics of the systems.
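One way to see why the electrochemical route is attractive: under the Nernst equation for the hydrogen electrode, E = −(RT/2F) ln(p_H2/p0), an applied cathodic potential maps to an equivalent H2 gas pressure. The sketch below assumes an ideal two-electron equilibrium at room temperature, which is a textbook simplification rather than a description of this work's setup.

```python
import math

# Physical constants and reference pressure
R, F, T, p0 = 8.314, 96485.0, 298.15, 1.0   # J/(mol K), C/mol, K, bar

def equivalent_pressure(E_volts):
    # Invert the Nernst relation E = -(R*T / (2*F)) * ln(p_H2 / p0)
    # to get the gas pressure equivalent to an applied potential
    return p0 * math.exp(-2 * F * E_volts / (R * T))

for E in (-0.05, -0.10):
    print(f"E = {E:+.2f} V  ->  p_H2 ~ {equivalent_pressure(E):.3g} bar")
```

A modest potential of −0.10 V already corresponds to an effective pressure of thousands of bar, which is why electrochemical loading can reach hydrogenation states that are difficult to access from the gas phase.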