795 results for Slot-based task-splitting algorithms
Abstract:
Hardware/software partitioning is a key stage in the co-design process for embedded systems. At this stage it is decided which components will be implemented as hardware co-processors and which will be implemented on a general-purpose processor. The decision is made by exploring the design space, evaluating a set of candidate solutions to establish which of them achieves the best balance among all the design metrics. To explore the solution space, most proposals use metaheuristic algorithms, most notably Genetic Algorithms and Simulated Annealing. In many cases this choice is not based on comparative analyses involving several algorithms on the same problem. This work presents the application of two algorithms, Stochastic Hill Climbing and Stochastic Hill Climbing with Restart, to the hardware/software partitioning problem. To validate the use of these algorithms, they are applied to a case study, namely the hardware/software partitioning of a JPEG encoder. In all experiments, both algorithms reach solutions comparable to those obtained by the more frequently used algorithms.
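The stochastic hill climbing with restart described above can be sketched as follows. The cost model (per-component hardware area, software execution time, and an area budget) is a hypothetical stand-in for the design metrics, not the thesis's actual cost function:

```python
import random

def partition_cost(partition, hw_cost, sw_cost, hw_budget):
    """Toy cost model (hypothetical): total software execution time plus a
    large penalty when the hardware area budget is exceeded.
    partition[i] is True if component i goes to hardware."""
    area = sum(h for h, bit in zip(hw_cost, partition) if bit)
    time = sum(s for s, bit in zip(sw_cost, partition) if not bit)
    penalty = 1000.0 * max(0, area - hw_budget)
    return time + penalty

def stochastic_hill_climb(n, cost, iters=2000, rng=None):
    """Flip one random component assignment per step; keep non-worsening moves."""
    rng = rng or random.Random(0)
    best = [rng.random() < 0.5 for _ in range(n)]
    best_c = cost(best)
    for _ in range(iters):
        cand = best[:]
        cand[rng.randrange(n)] = not cand[rng.randrange(0, n) if False else rng.randrange(n)] if False else (not cand[rng.randrange(n)])
        c = cost(cand)
        if c <= best_c:
            best, best_c = cand, c
    return best, best_c

def with_restarts(n, cost, restarts=5):
    """Restart variant: rerun from fresh random partitions, keep the best run."""
    runs = [stochastic_hill_climb(n, cost, rng=random.Random(seed))
            for seed in range(restarts)]
    return min(runs, key=lambda r: r[1])
```

On a trivial instance where every component fits in hardware, both variants converge to the all-hardware partition with zero cost.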
Abstract:
Stereotypies are abnormal repetitive behaviour patterns that are highly prevalent in laboratory mice and are thought to reflect impaired welfare: they are associated with impaired behavioural inhibition and may also reflect negative affective states. However, in mice the relationship between stereotypies and behavioural inhibition is inconclusive, and reliable measures of affective valence are lacking. Here we used an exploration-based task to assess cognitive bias as a measure of affective valence, and a two-choice guessing task to assess recurrent perseveration as a measure of impaired behavioural inhibition, to test mice with different forms and expression levels of stereotypic behaviour. We trained 44 CD-1 and 40 C57BL/6 female mice to discriminate between positively and negatively cued arms in a radial maze and tested their responses to previously inaccessible ambiguous arms. In CD-1 mice, (i) mice with higher stereotypy levels displayed a negative cognitive bias, and this was influenced by the form of stereotypy performed: (ii) negative cognitive bias was evident in back-flipping mice, while (iii) no such effect was found in mice displaying bar-mouthing or cage-top twirling. In C57BL/6 mice neither route-tracing nor bar-mouthing was associated with cognitive bias, indicating that in this strain these stereotypies may not reflect negative affective states. Conversely, while we found no relation of stereotypy to recurrent perseveration in CD-1 mice, C57BL/6 mice with higher levels of route-tracing, but not bar-mouthing, made more repetitive responses in the guessing task. Our findings confirm previous research indicating that the implications of stereotypies for animal welfare may strongly depend on the species and strain of animal as well as on the form and expression level of the stereotypy. Furthermore, they indicate that variation in stereotypic behaviour may represent an important source of variation in many animal experiments.
Abstract:
Andrews and Curtis conjectured in 1965 that every balanced presentation of the trivial group can be transformed into a standard presentation by a finite sequence of elementary transformations. Recent computational work by Miasnikov and Myasnikov on this problem has been based on genetic algorithms. We show that a computational attack based on a breadth-first search of the tree of equivalent presentations is also viable, and seems to outperform that based on genetic algorithms. It allows us to extract shorter proofs (in some cases, provably shortest) and to consider the length thirteen case for two generators. We prove that, up to equivalence, there is a unique minimum potential counterexample.
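The breadth-first attack on the tree of equivalent presentations can be sketched as follows. The relator encoding (lowercase letters for generators, uppercase for their inverses), the Andrews-Curtis move set, and the length-based pruning bound are illustrative assumptions, not the authors' implementation:

```python
from collections import deque

def free_reduce(w):
    """Cancel adjacent inverse pairs in a word, e.g. 'xX' -> ''."""
    out = []
    for c in w:
        if out and out[-1] == c.swapcase():
            out.pop()
        else:
            out.append(c)
    return ''.join(out)

def inverse(w):
    """(xy)^-1 = Y X in this encoding."""
    return ''.join(c.swapcase() for c in reversed(w))

def neighbours(pres, gens='xy'):
    """Elementary AC-moves on a balanced presentation (tuple of relators):
    multiply one relator by another (or its inverse), invert a relator,
    or conjugate a relator by a generator."""
    r = list(pres)
    for i in range(len(r)):
        for j in range(len(r)):
            if i != j:
                yield tuple(free_reduce(r[i] + r[j]) if k == i else r[k] for k in range(len(r)))
                yield tuple(free_reduce(r[i] + inverse(r[j])) if k == i else r[k] for k in range(len(r)))
        yield tuple(inverse(r[i]) if k == i else r[k] for k in range(len(r)))
        for g in gens + gens.upper():
            yield tuple(free_reduce(g + r[i] + g.swapcase()) if k == i else r[k] for k in range(len(r)))

def bfs(start, goal, max_len=6, limit=100000):
    """Breadth-first search over presentations, pruning long relators.
    Returns the number of moves to reach the goal, or None."""
    seen = {start}
    q = deque([(start, 0)])
    while q and limit:
        limit -= 1
        state, d = q.popleft()
        if set(state) == set(goal):
            return d
        for nxt in neighbours(state):
            if nxt not in seen and all(len(w) <= max_len for w in nxt):
                seen.add(nxt)
                q.append((nxt, d + 1))
    return None
```

Because BFS explores by depth, the first time the standard presentation is reached the move sequence is provably shortest within the pruning bound, which is the property the abstract exploits.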
Abstract:
The n-tuple pattern recognition method has been tested using a selection of 11 large data sets from the European Community StatLog project, so that the results could be compared with those reported for the 23 other algorithms the project tested. The results indicate that this ultra-fast memory-based method is a viable competitor with the others, which include optimisation-based neural network algorithms, even though the theory of memory-based neural computing is less highly developed in terms of statistical theory.
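A minimal sketch of the memory-based n-tuple method (WISARD-style) follows. The tuple size, tuple count, and dictionary-backed memory are illustrative choices, not the StatLog study's parameters:

```python
import random

class NTupleClassifier:
    """n-tuple classifier sketch: each of m tuples samples n random bit
    positions from the input; training marks the observed address per class,
    and scoring counts how many tuples see a previously stored address."""

    def __init__(self, nbits, n=3, m=20, seed=0):
        rng = random.Random(seed)
        self.tuples = [rng.sample(range(nbits), n) for _ in range(m)]
        self.memory = {}  # (label, tuple_index, address) -> True if seen

    def _address(self, x, t):
        return tuple(x[i] for i in t)

    def fit(self, X, y):
        for x, label in zip(X, y):
            for k, t in enumerate(self.tuples):
                self.memory[(label, k, self._address(x, t))] = True

    def predict(self, x):
        labels = {lbl for (lbl, _, _) in self.memory}
        score = {lbl: sum(self.memory.get((lbl, k, self._address(x, t)), False)
                          for k, t in enumerate(self.tuples))
                 for lbl in labels}
        return max(score, key=score.get)
```

Training is a single pass of table writes and classification is a single pass of table lookups, which is why the method is described as ultra-fast compared with optimisation-based neural networks.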
Abstract:
The study of gender differences in prospective memory (i.e., remembering to remember) has received modest attention in the literature. The few reported studies investigating either subjective or objective evaluations of prospective memory have shown inconsistent data. In this study, we aimed to verify the presence of gender differences during the performance of an objective prospective memory test by considering the weight of specific variables such as length of delay, type of response, and type of cue. We submitted a sample of 100 healthy Italian participants (50 men and 50 women) to a test expressly developed to assess prospective memory: The Memory for Intentions Screening Test. Women performed better than men in remembering to do an event-based task (i.e., prompted by an external event) and when the task required a physical response modality. We discuss the behavioural differences that emerged by considering the possible role of sociological, biological, neuroanatomical, and methodological variables.
Abstract:
The objective of this study was to develop a model to predict the transport and fate of gasoline components of environmental concern in the Miami River by mathematically simulating the movement of dissolved benzene, toluene, xylene (BTX), and methyl tertiary-butyl ether (MTBE) arising from minor gasoline spills in the inter-tidal zone of the river. The computer codes were based on mathematical algorithms that account for the advective and dispersive physical phenomena along the river and the prevailing phase transformations of BTX and MTBE. The phase transformations included volatilization and settling. The model used a finite-difference scheme under steady-state conditions, with a set of numerical equations solved by two numerical methods: Gauss-Seidel and Jacobi iterations. A numerical validation process was conducted by comparing the results from both methods with analytical and numerical reference solutions. Since similar trends were achieved after the numerical validation process, it was concluded that the computer codes were algorithmically correct. The Gauss-Seidel iteration yielded a faster convergence rate than the Jacobi iteration, so that code was selected for further development of the computer program and software. The model was then analyzed for sensitivity; it was found to be very sensitive to wind speed but not to sediment settling velocity. Computer software was developed with the model code embedded. The software provides two user-friendly visual forms, one to interface with the database files and the other to execute and present the graphical and tabulated results. For all predicted concentrations of BTX and MTBE, the maximum concentrations were over an order of magnitude lower than current drinking-water standards.
It should be pointed out, however, that concentrations smaller than the reported standards, although not harmful to humans, may be very harmful to organisms of the trophic levels of the Miami River ecosystem and associated waters. This computer model can be used for the rapid assessment and management of the effects of minor gasoline spills on inter-tidal riverine water quality.
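The two iterative solvers compared in the study can be sketched in their generic textbook form; this is the standard formulation for a diagonally dominant linear system, not the study's river-transport code:

```python
def jacobi(A, b, iters=100):
    """Jacobi iteration: every update in a sweep uses only the values
    from the previous sweep."""
    n = len(b)
    x = [0.0] * n
    for _ in range(iters):
        x = [(b[i] - sum(A[i][j] * x[j] for j in range(n) if j != i)) / A[i][i]
             for i in range(n)]
    return x

def gauss_seidel(A, b, iters=100):
    """Gauss-Seidel iteration: freshly updated components are used
    immediately within the same sweep, which typically converges faster."""
    n = len(b)
    x = [0.0] * n
    for _ in range(iters):
        for i in range(n):
            x[i] = (b[i] - sum(A[i][j] * x[j] for j in range(n) if j != i)) / A[i][i]
    return x
```

For a small diagonally dominant system such as 4x + y = 1, x + 3y = 2 both methods converge to x = 1/11, y = 7/11, with Gauss-Seidel reaching a given tolerance in fewer sweeps, consistent with the study's observation.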
Abstract:
This dissertation presents a system-wide approach, based on genetic algorithms, for optimizing transfer times across an entire bus transit system. Optimizing transfer times in a transit system is a complicated problem because of the large set of binary and discrete variables involved. The combinatorial nature of the problem imposes a computational burden and makes it difficult to solve by classical mathematical programming methods. The genetic algorithm proposed in this research attempts to find an optimal solution to the transfer time optimization problem by searching for a combination of adjustments to the timetables of all routes in the system. It makes use of existing scheduled timetables and ridership demand at all transfer locations, and takes into consideration the randomness of bus arrivals. Data from Broward County Transit are used to compute total transfer times. The proposed genetic algorithm-based approach proves capable of producing substantial time savings over the existing transfer times, in a reasonable amount of computation time. The dissertation also addresses issues related to spatial and temporal modeling, variability in bus arrival and departure times, walking time, and the integration of scheduling and ridership data.
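A minimal sketch of a genetic algorithm for timetable-offset adjustment follows. The transfer model (passenger wait as the adjusted gap modulo a common headway), the crossover and mutation operators, and all parameters are illustrative assumptions, not the dissertation's formulation:

```python
import random

def total_wait(offsets, transfers, headway=30):
    """Hypothetical transfer model: the wait at each transfer
    (from_route, to_route, base_gap) is the adjusted arrival gap,
    wrapped modulo a shared headway in minutes."""
    return sum((gap + offsets[b] - offsets[a]) % headway
               for a, b, gap in transfers)

def genetic_search(n_routes, transfers, pop=30, gens=60, seed=0):
    """Chromosome = one timetable offset per route; elitist selection,
    one-point crossover, and random-reset mutation."""
    rng = random.Random(seed)
    def fitness(ind):
        return total_wait(ind, transfers)
    population = [[rng.randrange(30) for _ in range(n_routes)] for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=fitness)
        parents = population[:pop // 2]          # elitism: keep best half
        children = []
        while len(children) < pop - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, n_routes)     # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < 0.3:               # mutation: re-shift one route
                child[rng.randrange(n_routes)] = rng.randrange(30)
            children.append(child)
        population = parents + children
    return min(population, key=fitness)
```

Elitism guarantees the best total wait never worsens between generations, mirroring the dissertation's claim of steady improvement over the existing timetable.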
Abstract:
This thesis presents a hybrid technique for the design of frequency selective surfaces (FSS) on an isotropic dielectric layer, considering various geometries for the elements of the unit cell. Specifically, the hybrid technique uses the equivalent circuit method in conjunction with a genetic algorithm, aiming at the synthesis of structures with single-band and dual-band responses. The equivalent circuit method models the structure by means of an equivalent circuit, with circuits obtained for the different geometries. From the parameters of these circuits, the transmission and reflection characteristics of the patterned structures can be obtained. For the optimization of the patterned structures according to the desired frequency response, the Matlab™ optimization tool optimtool proved easy to use, allowing important results of the optimization analysis to be explored. In this thesis, numerical and experimental results are presented for the different characteristics of the analyzed geometries. To this end, a technique based on genetic algorithms and differential geometry was devised to obtain the parameter N, yielding rational algebraic models that determine more accurate values of N and facilitating new FSS designs with these geometries. The optimal results for N are grouped according to the occupancy factor of the cell and the thickness of the dielectric, for modelling the structures by means of rational algebraic equations. Furthermore, for the proposed hybrid model a fitness function was developed to calculate the error incurred in the definition of FSS bandwidths for single-band and dual-band transmission responses. This thesis covers the construction of FSS prototypes with the frequency settings and bandwidths obtained using this function.
The FSS were initially evaluated through simulations performed with the commercial software Ansoft Designer™, followed by simulation with the equivalent circuit method to obtain a value of N that makes the resonance frequency and bandwidth of the analyzed FSS converge; the results were then compared. The methodology is validated by the construction and measurement of prototypes with different cell geometries for the FSS arrays.
Abstract:
Software bug analysis is one of the most important activities in software quality. Rapid and correct implementation of the necessary fix matters both to developers, who must deliver fully functioning software, and to users, who need to perform their daily tasks. In this context, incorrect classification of bugs can lead to unwanted situations. One of the main attributes assigned to a bug in its initial report is severity, which reflects the urgency of correcting the problem. In this scenario, we identified, in datasets extracted from five open-source systems (Apache, Eclipse, Kernel, Mozilla and Open Office), an irregular distribution of bugs with respect to the existing severities, which is an early sign of misclassification. In the datasets analyzed, about 85% of bugs are ranked with normal severity. This classification rate can have a negative influence in the software development context, where a misclassified bug may be allocated to a developer with too little experience to solve it, so that its correction takes longer or even produces an incorrect implementation. Several studies in the literature have disregarded normal bugs, working only with the portion of bugs initially considered severe or not severe. This work investigated this portion of the data, with the purpose of identifying whether normal severity reflects the real impact and urgency, investigating whether there are bugs (initially classified as normal) that could be classified with another severity, and assessing whether there are impacts for developers in this context. For this, an automatic classifier was developed, based on three algorithms (Naïve Bayes, MaxEnt and Winnow), to assess whether normal severity is correct for the bugs initially categorized with this severity.
The algorithms achieved accuracy of about 80% and showed that between 21% and 36% of the bugs should have been classified differently (depending on the algorithm), which represents somewhere between 70,000 and 130,000 bugs in the dataset.
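A minimal multinomial Naive Bayes severity classifier, one of the three algorithms named above, can be sketched as follows. The tokenization, add-one smoothing, and toy severity labels are illustrative assumptions, not the study's implementation:

```python
import math
from collections import Counter, defaultdict

def train_nb(docs):
    """Train multinomial Naive Bayes with add-one smoothing.
    docs: list of (token_list, label) pairs, e.g. tokenized bug reports."""
    class_counts = Counter(lbl for _, lbl in docs)
    word_counts = defaultdict(Counter)
    vocab = set()
    for tokens, lbl in docs:
        word_counts[lbl].update(tokens)
        vocab.update(tokens)
    return class_counts, word_counts, vocab, len(docs)

def predict_nb(model, tokens):
    """Pick the label maximizing log prior + sum of smoothed log likelihoods."""
    class_counts, word_counts, vocab, n = model
    best, best_lp = None, -math.inf
    for lbl, cc in class_counts.items():
        total = sum(word_counts[lbl].values())
        lp = math.log(cc / n)
        for t in tokens:
            lp += math.log((word_counts[lbl][t] + 1) / (total + len(vocab)))
        if lp > best_lp:
            best, best_lp = lbl, lp
    return best
```

Applied to bug-report text, such a classifier produces a predicted severity that can be compared against the reporter's "normal" label, which is the study's core reclassification check.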
Abstract:
Prior work of our research group, which quantified the alarming levels of radiation dose to patients with Crohn’s disease from medical imaging and the notable shift towards CT imaging making these patients an at-risk group, provided context for this work. CT delivers some of the highest doses of ionising radiation in diagnostic radiology. Once a medical imaging examination is deemed justified, there is an onus on the imaging team to endeavour to produce diagnostic quality CT images at the lowest possible radiation dose to that patient. The fundamental limitation of conventional CT raw data reconstruction was the inherent coupling of administered radiation dose with observed image noise: the lower the radiation dose, the noisier the image. The renaissance, rediscovery and refinement of iterative reconstruction removes this limitation, allowing either an improvement in image quality without increasing radiation dose or maintenance of image quality at a lower radiation dose compared with traditional image reconstruction. This thesis is fundamentally an exercise in optimisation in clinical CT practice, with the objectives of assessing iterative reconstruction as a method for improving image quality in CT, exploring the associated potential for radiation dose reduction, and developing a new split-dose CT protocol with the aim of achieving and validating diagnostic quality submillisievert CT imaging in patients with Crohn’s disease. In this study, we investigated the interplay of user-selected parameters on radiation dose and image quality in phantoms and cadavers, comparing traditional filtered back projection (FBP) with iterative reconstruction algorithms. This resulted in the development of an optimised, refined and appropriate split-dose protocol for CT of the abdomen and pelvis in clinical patients with Crohn’s disease, allowing contemporaneous acquisition of both modified and conventional dose CT studies.
This novel algorithm was then applied to 50 patients with a suspected acute complication of known Crohn’s disease and the raw data reconstructed with FBP, adaptive statistical iterative reconstruction (ASiR) and model based iterative reconstruction (MBIR). Conventional dose CT images with FBP reconstruction were used as the reference standard with which the modified dose CT images were compared in terms of radiation dose, diagnostic findings and image quality indices. As there are multiple possible user-selected strengths of ASiR available, these were compared in terms of image quality to determine the optimal strength for this modified dose CT protocol. Modified dose CT images with MBIR were also compared with contemporaneous abdominal radiographs, where performed, in terms of diagnostic yield and radiation dose. Finally, attenuation measurements in organs and tissues with each reconstruction algorithm were compared to assess for preservation of tissue characterisation capabilities. In the phantom and cadaveric models, both forms of iterative reconstruction examined (ASiR and MBIR) were superior to FBP across a wide variety of imaging protocols, with MBIR superior to ASiR in all areas other than reconstruction speed. We established that ASiR appears to work to a target percentage noise reduction whilst MBIR works to a target residual level of absolute noise in the image. Modified dose CT images reconstructed with both ASiR and MBIR were non-inferior to conventional dose CT with FBP in terms of diagnostic findings, despite reduced subjective and objective indices of image quality. Mean dose reductions of 72.9-73.5% were achieved with the modified dose protocol, with a mean effective dose of 1.26 mSv. MBIR was again demonstrated superior to ASiR in terms of image quality.
The overall optimal ASiR strength for the modified dose protocol used in this work is ASiR 80%, as this provides the most favourable balance of peak subjective image quality indices with less objective image noise than the corresponding conventional dose CT images reconstructed with FBP. Despite guidelines to the contrary, abdominal radiographs are still often used in the initial imaging of patients with a suspected complication of Crohn’s disease. We confirmed the superiority of modified dose CT with MBIR over abdominal radiographs at comparable doses in detection of Crohn’s disease and non-Crohn’s disease related findings. Finally, we demonstrated (in phantoms, cadavers and in vivo) that attenuation values do not change significantly across reconstruction algorithms meaning preserved tissue characterisation capabilities with iterative reconstruction. Both adaptive statistical and model based iterative reconstruction algorithms represent feasible methods of facilitating acquisition diagnostic quality CT images of the abdomen and pelvis in patients with Crohn’s disease at markedly reduced radiation doses. Our modified dose CT protocol allows dose savings of up to 73.5% compared with conventional dose CT, meaning submillisievert imaging is possible in many of these patients.
Abstract:
Background: Potentially inappropriate prescribing (PIP) is common in older people in primary care and can result in increased morbidity, adverse drug events and hospitalisations. We previously demonstrated the success of a multifaceted intervention in decreasing PIP in primary care in a cluster randomised controlled trial (RCT).
Objective: We sought to determine whether the improvement in PIP in the short term was sustained at 1-year follow-up.
Methods: A cluster RCT was conducted with 21 GP practices and 196 patients (aged ≥70) with PIP in Irish primary care. Intervention participants received a complex multifaceted intervention incorporating academic detailing, medicine review with web-based pharmaceutical treatment algorithms that provide recommended alternative treatment options, and tailored patient information leaflets. Control practices delivered usual care and received simple, patient-level PIP feedback. Primary outcomes were the proportion of patients with PIP and the mean number of potentially inappropriate prescriptions at 1-year follow-up. Intention-to-treat analysis using random effects regression was used.
Results: All 21 GP practices and 186 (95 %) patients were followed up. We found that at 1-year follow-up, the significant reduction in the odds of PIP exposure achieved during the intervention was sustained after its discontinuation (adjusted OR 0.28, 95 % CI 0.11 to 0.76, P = 0.01). Intervention participants had significantly lower odds of having a potentially inappropriate proton pump inhibitor compared to controls (adjusted OR 0.40, 95 % CI 0.17 to 0.94, P = 0.04).
Conclusion: The significant reduction in the odds of PIP achieved during the intervention was sustained after its discontinuation. These results indicate that improvements in prescribing quality can be maintained over time.
Abstract:
Background
The OPTI-SCRIPT cluster randomised controlled trial (RCT) found that a three-phase multifaceted intervention including academic detailing with a pharmacist, GP-led medicines reviews, supported by web-based pharmaceutical treatment algorithms, and tailored patient information leaflets, was effective in reducing potentially inappropriate prescribing (PIP) in Irish primary care. We report a process evaluation exploring the implementation of the intervention, the experiences of those participating in the study and lessons for future implementation.
Methods
The OPTI-SCRIPT trial included 21 GP practices and 196 patients. The process evaluation used mixed methods. Quantitative data were collected from all GP practices and semi-structured interviews were conducted with GPs from intervention and control groups, and a purposive sample of patients from the intervention group. All interviews were transcribed verbatim and analysed using a thematic analysis.
Results
Despite receiving a standardised academic detailing session, intervention delivery varied among GP practices. Just over 70 % of practices completed medicines review as recommended with the patient present. Only single-handed practices conducted reviews without patients present, highlighting the influence of practice characteristics and resources on variation. Medications were more likely to be completely stopped or switched to another more appropriate medication when reviews were conducted with patients present. The patient information leaflets were not used by any of the intervention practices. Both GP (32 %) and patient (40 %) recruitment rates were modest. For those who did participate, overall, the experience was positively viewed, with GPs and patients referring to the value of medication reviews to improve prescribing and reduce unnecessary medications. Lack of time in busy GP practices and remuneration were identified as organisational barriers to future implementation.
Conclusions
The OPTI-SCRIPT intervention was positively viewed by both GPs and patients, both of whom valued the study’s objectives. Patient information leaflets were not a successful component of the intervention. Academic detailing and medication reviews are important components in changing PIP, and having patients present during the review process seems to be a more effective approach for decreasing PIP.
Abstract:
The mobile-robot route-planning problem consists of determining the best route for a robot in a static and/or dynamic environment, one capable of moving it from a starting point to a final point, also known as the goal state. The present work applies an approach based on Genetic Algorithms to route planning for multiple robots in a complex environment composed of fixed and moving obstacles. Implementing the model in NetLogo, a tool used for simulating multi-agent applications, allowed the robots and obstacles present in the environment to be modelled as interactive agents, enabling the development of obstacle detection and avoidance processes. The approach searches for the best route for the robots and presents a model composed of the basic reproduction and mutation operators, plus a new double refinement operator capable of improving the best solutions found by eliminating useless moves. In addition, the route of each robot is computed by generating sub-segments: rather than a single route connecting the start and end points of the scenario, several small sub-routes are computed which, when connected, form a single path capable of taking the robot to the goal state. Two scenarios were developed to evaluate the scalability of the approach: the first is a simple scenario composed of just one robot, one moving obstacle and a few fixed obstacles; the second is a more robust, larger scenario composed of multiple robots and several fixed and moving obstacles. Finally, comparative performance tests were carried out between the Genetic Algorithm approach and the A* algorithm, using as the comparison criterion the length of the routes obtained in the twenty simulations executed for each approach.
The analysis of the results was specified using Student's t-test.
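The "useless move" elimination performed by the double refinement operator described above can be sketched as follows. The four-direction move encoding on a grid is an illustrative assumption, not the thesis's representation:

```python
# Opposite grid moves cancel each other out and make no net progress.
OPPOSITE = {'U': 'D', 'D': 'U', 'L': 'R', 'R': 'L'}

def refine(route):
    """Refinement sketch: repeatedly delete adjacent opposite moves
    (e.g. 'U' followed by 'D'); a stack handles newly exposed pairs."""
    out = []
    for step in route:
        if out and OPPOSITE[out[-1]] == step:
            out.pop()
        else:
            out.append(step)
    return out

def endpoint(route, start=(0, 0)):
    """Where the route ends, used to check refinement preserves the goal."""
    moves = {'U': (0, 1), 'D': (0, -1), 'L': (-1, 0), 'R': (1, 0)}
    x, y = start
    for s in route:
        dx, dy = moves[s]
        x, y = x + dx, y + dy
    return (x, y)
```

Since every cancelled pair has zero net displacement, refinement shortens a chromosome without changing where the robot ends up, which is exactly what a fitness function based on route length rewards.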
Abstract:
Abstract: Many individuals who have had a stroke have motor impairments, such as timing deficits, that hinder their ability to complete daily activities like getting dressed. Robotic rehabilitation is an increasingly popular therapeutic avenue for improving motor recovery in this population. Yet most studies have focused on improving the spatial aspect of movement (e.g. reaching), not the temporal one (e.g. timing). Hence, the main aim of this study was to compare two types of robotic rehabilitation on the immediate improvement of timing accuracy: haptic guidance (HG), which consists of guiding the person to make the correct movement, thereby decreasing his or her movement errors, and error amplification (EA), which consists of increasing the person’s movement errors. The secondary objective was to explore whether the side of the stroke lesion had an effect on timing accuracy following HG and EA training. Thirty-four persons who had a stroke (average age 67 ± 7 years) participated in a single training session of a timing-based task (a simulated pinball-like task), in which they had to activate a robot at the correct moment to successfully hit targets presented at random on a computer screen. Participants were randomly divided into two groups, receiving either HG or EA. During the same session, a baseline phase and a retention phase were administered before and after each training, and these phases were compared in order to evaluate and compare the immediate impact of HG and EA on movement timing accuracy. The results showed that HG improved immediate timing accuracy (p = 0.03), but EA did not (p = 0.45). Comparing the two trainings, HG proved superior to EA at improving timing (p = 0.04). Furthermore, a significant correlation was found between the side of stroke lesion and the change in timing accuracy following EA (r_pb = 0.7, p = 0.001), but not HG (r_pb = 0.18, p = 0.24).
In other words, a deterioration in timing accuracy was found for participants with a lesion in the left hemisphere who had trained with EA, whereas for participants with a right-sided stroke lesion an improvement in timing accuracy was noted following EA. In sum, HG seems to help improve immediate timing accuracy for individuals who have had a stroke. Still, the side of the stroke lesion seems to play a part in participants’ response to training. This remains to be further explored, along with the impact of providing more training sessions in order to assess any long-term benefits of HG or EA.