997 results for Computer files.


Abstract:

Bioresorbable polymers such as polylactide (PLA) and polylactide-co-glycolide (PLGA) have been used successfully as biomaterials in a wide range of medical applications. However, their slow degradation rates and their propensity to lose strength before mass have caused problems. A central challenge for the development of these materials is the assurance of consistent and predictable in vivo degradation. Previous work has illustrated the potential to influence polymer degradation using electron beam (e-beam) radiation. The work reported in this paper further investigates the use of e-beam radiation to achieve a more surface-specific effect. Variation of e-beam energy was studied as a means to control the effective penetration depth in poly-L-lactide (PLLA). PLLA samples were exposed to e-beam radiation at individual energies of 0.5 MeV, 0.75 MeV and 1.5 MeV. The near-surface region of the PLLA samples was shown to be affected by e-beam irradiation, with induced changes in molecular weight, morphology, flexural strength and degradation profile. Moreover, the depth to which the physical properties of the polymer were affected was shown to depend on the beam energy used. Computer modelling of electron transmission at each beam energy agreed well with these findings.
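The energy dependence of the penetration depth can be estimated with a standard empirical relation. The sketch below uses the Katz-Penfold range formula (valid for roughly 0.01 to 2.5 MeV electrons) together with an assumed nominal PLLA density of 1.25 g/cm3; it illustrates the depth-energy relationship only and is not the transmission model used in the paper.

```python
import math

def katz_penfold_range(energy_mev: float) -> float:
    """Empirical Katz-Penfold electron range in g/cm^2,
    valid for roughly 0.01-2.5 MeV beam energies."""
    return 0.412 * energy_mev ** (1.265 - 0.0954 * math.log(energy_mev))

PLLA_DENSITY = 1.25  # g/cm^3 -- assumed nominal density for PLLA

for e in (0.5, 0.75, 1.5):  # beam energies used in the paper, in MeV
    depth_mm = katz_penfold_range(e) / PLLA_DENSITY * 10.0  # cm -> mm
    print(f"{e:>4} MeV: ~{depth_mm:.1f} mm maximum penetration")
```

Running this gives roughly 1.3 mm at 0.5 MeV and 5.4 mm at 1.5 MeV, consistent with the qualitative picture that higher beam energies affect the polymer to greater depths.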

Abstract:

Objectives: To describe the patterns of computer use during patient visits to family doctors, and to determine whether doctors alter their pattern of computer use in consultations with significant psychological content.

Abstract:

This paper presents a feature selection method for data classification which combines a model-based variable selection technique with a fast two-stage subset selection algorithm. The relationship between a specified (and complete) set of candidate features and the class label is modelled using a non-linear full regression model that is linear in the parameters. The performance of a sub-model, measured by the sum of squared errors (SSE), is used to score the informativeness of the subset of features involved in that sub-model. The two-stage subset selection algorithm converges to a solution sub-model whose SSE is locally minimized. The features involved in the solution sub-model are then used as inputs to support vector machines (SVMs) for classification. The memory requirement of the algorithm is independent of the number of training patterns, a property that makes the method suitable for applications running on mobile devices, where physical RAM is very limited. An activity-recognition application was developed that implements the proposed feature selection algorithm and an SVM training procedure. Experiments were carried out with the application running on a PDA, performing human activity recognition from accelerometer data. A comparison with an information-gain-based feature selection method demonstrates the effectiveness and efficiency of the proposed algorithm.
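As a rough illustration of the idea, the sketch below scores candidate feature subsets by the SSE of a least-squares fit of the class label and grows the subset greedily, then trains an SVM on the selected features. It is a simplified, single-stage stand-in for the paper's two-stage algorithm, using plain linear regressors rather than the full non-linear, linear-in-the-parameters model; the function names and synthetic data are invented for the example.

```python
import numpy as np
from sklearn.svm import SVC

def sse_score(X_sub, y):
    """SSE of a least-squares fit of y on the candidate feature
    subset (a linear-in-the-parameters sub-model with a bias term)."""
    A = np.column_stack([X_sub, np.ones(len(X_sub))])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ coef
    return float(resid @ resid)

def forward_select(X, y, k):
    """Greedy forward selection: at each step add the feature whose
    inclusion most reduces the sub-model SSE."""
    selected, remaining = [], list(range(X.shape[1]))
    while len(selected) < k:
        best = min(remaining, key=lambda j: sse_score(X[:, selected + [j]], y))
        selected.append(best)
        remaining.remove(best)
    return selected

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 20))                   # 20 candidate features
y = (X[:, 3] + 0.5 * X[:, 7] > 0).astype(float)  # label driven by 2 features

feats = forward_select(X, y, k=2)
clf = SVC(kernel="rbf").fit(X[:, feats], y)
print("selected:", feats, " train accuracy:", clf.score(X[:, feats], y))
```

Note that each SSE evaluation touches only the candidate columns and the label vector, which hints at why a scheme like this can keep memory use modest on a constrained device.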

Abstract:

A scalable, large-vocabulary, speaker-independent speech recognition system is being developed using Hidden Markov Models (HMMs) for acoustic modeling and a Weighted Finite State Transducer (WFST) to compile sentence, word, and phoneme models. The system comprises a software back-end search and an FPGA-based Gaussian calculation, both of which are covered here. In this paper, we present an efficient pipelined design implemented both as an embedded peripheral and as a scalable, parallel hardware accelerator. Both architectures have been implemented on an Alpha Data XRC-5T1 reconfigurable computer housing a Virtex 5 SX95T FPGA. The core has been tested and is capable of calculating a full set of Gaussian results for 3825 acoustic models in 9.03 ms, which, coupled with a back-end search over a 5000-word vocabulary, has provided an accuracy of over 80%. Parallel implementations have been designed with up to 32 cores and have been successfully implemented at a clock frequency of 133 MHz.
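The quantity computed by the FPGA core is, per frame, the log-likelihood of the frame under each acoustic model's Gaussian. A minimal software reference for that calculation, assuming diagonal-covariance Gaussians and a 39-dimensional feature frame (both common in HMM recognisers, though not stated explicitly above), might look like:

```python
import numpy as np

def gaussian_log_likelihoods(x, means, variances):
    """Log-likelihood of one feature frame x under a bank of
    diagonal-covariance Gaussians (one row per acoustic model).

    x         : (D,)   feature vector for the current frame
    means     : (N, D) Gaussian means
    variances : (N, D) diagonal variances
    """
    log_norm = -0.5 * np.sum(np.log(2.0 * np.pi * variances), axis=1)
    mahalanobis = np.sum((x - means) ** 2 / variances, axis=1)
    return log_norm - 0.5 * mahalanobis  # shape (N,)

rng = np.random.default_rng(1)
D, N = 39, 3825                 # 39-dim frame, 3825 acoustic models
frame = rng.normal(size=D)
mu = rng.normal(size=(N, D))
var = rng.uniform(0.5, 2.0, size=(N, D))
print(gaussian_log_likelihoods(frame, mu, var).shape)
```

Because the log-normalisation term depends only on the model parameters, a hardware pipeline can precompute it and stream frames through the distance calculation, which is the part that parallelises across cores.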

Abstract:

The use of accelerators, with compute architectures distinct from that of the CPU, has become a new research frontier in high-performance computing over the past five years. This paper is a case study of how the instruction-level parallelism offered by three accelerator technologies, FPGA, GPU and ClearSpeed, can be exploited in atomic physics. The algorithm studied is the evaluation of two-electron integrals by direct numerical quadrature, a task that arises in the study of intermediate-energy electron scattering by hydrogen atoms. The results of our 'productivity' study show that while each accelerator is viable, there are considerable differences in the implementation strategies that must be followed on each.
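To make the computational kernel concrete, direct numerical quadrature of a two-dimensional integral can be sketched with a tensor-product Gauss-Legendre rule, as below. The integrand here is a toy function with the 1/r_> structure typical of two-electron repulsion integrals, not the paper's actual scattering integrand, and the interval and point count are arbitrary choices for the example.

```python
import numpy as np

def gauss_legendre_2d(f, a, b, n):
    """Direct numerical quadrature of f(r1, r2) over [a,b] x [a,b]
    using an n-point Gauss-Legendre rule in each dimension."""
    x, w = np.polynomial.legendre.leggauss(n)
    # map nodes and weights from [-1, 1] to [a, b]
    r = 0.5 * (b - a) * x + 0.5 * (b + a)
    w = 0.5 * (b - a) * w
    R1, R2 = np.meshgrid(r, r, indexing="ij")
    return float(np.sum(np.outer(w, w) * f(R1, R2)))

# Toy integrand with the 1/max(r1, r2) form of electron-repulsion terms.
def integrand(r1, r2):
    return np.exp(-r1 - r2) / np.maximum(r1, r2)

print(gauss_legendre_2d(integrand, 1e-6, 20.0, 64))
```

The kernel is an embarrassingly parallel sum over an n-by-n grid of independent function evaluations, which is why it maps onto all three accelerator architectures, albeit with very different implementation effort.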

Abstract:

This paper describes the development of a novel metaheuristic that combines an electromagnetic-like mechanism (EM) and the great deluge algorithm (GD) for the university course timetabling problem. This well-known timetabling problem assigns lectures to a limited number of timeslots and rooms, maximizing the overall quality of the timetable while taking various constraints into account. EM is a population-based stochastic global optimization algorithm inspired by physics, in which sample points move toward optimality through simulated attraction and repulsion. GD is a local search procedure that allows worse solutions to be accepted, subject to a given upper boundary or 'level'. In this paper, the dynamic force calculated by the attraction-repulsion mechanism is used as a decreasing rate to update the 'level' during the search. The proposed method has been applied to a range of benchmark university course timetabling problems from the literature. Moreover, the viability of the method has been tested by comparing its results with other reported results from the literature, demonstrating that the method is able to produce solutions that improve on those currently published. We believe this is due to the combination of the two approaches and the ability of the resulting algorithm to keep all solutions converging throughout the search.
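A minimal sketch of the great deluge acceptance rule may help. In the hybrid described above the decrease rate of the 'level' is derived from the EM attraction-repulsion force; the sketch below uses a fixed rate and an invented toy cost function purely for illustration.

```python
import random

def great_deluge(initial, cost, neighbour, level_rate, iters):
    """Minimisation by the great deluge rule: accept any candidate whose
    cost is below the current 'level', and lower the level each step.
    Here level_rate is a fixed constant; in the paper it is derived
    dynamically from the EM attraction-repulsion force."""
    current = initial
    level = cost(current)
    for _ in range(iters):
        candidate = neighbour(current)
        if cost(candidate) <= level or cost(candidate) <= cost(current):
            current = candidate
        level -= level_rate  # shrink the acceptance boundary
    return current

# toy usage: minimise a 1-D quadratic by random perturbation
best = great_deluge(
    initial=10.0,
    cost=lambda s: (s - 3.0) ** 2,
    neighbour=lambda s: s + random.uniform(-1.0, 1.0),
    level_rate=0.05,
    iters=2000,
)
print(round(best, 2))
```

The appeal of the rule is that early in the search the high level admits worsening moves, while the steadily falling level gradually forces convergence, which matches the role it plays in the hybrid above.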

Abstract:

Deformation localisation is the main cause of material failure in the cold forging of titanium alloys and is thus closely related to the production yield of cold forging. To study the influence of process parameters on dynamic compression, a numerical dynamic compression model for titanium alloys has been constructed that accounts for material constitutive behaviour together with the relevant physical and process parameters. By adjusting the process parameters, the severity of strain localisation and the stress state in the localised zone can be controlled, thus enhancing the compression performance of titanium alloys.

Abstract:

Advances in surgical procedure, prosthesis design, and biomaterials performance have considerably increased the longevity of total joint replacements. Preoperative planning is another step in joint replacement that may have the potential to improve clinical outcome for the individual patient, but it has remained relatively unchanged for a long time. One means of advancing this aspect of joint replacement surgery may be to incorporate predictive computer simulation into the planning process. In this article, the potential of patient-specific finite element analysis in preoperative assessment is investigated. Seventeen patient-specific finite element models of cemented Charnley reconstructions were created, of which six were early (