970 results for computer algorithm


Abstract:

The question of the trainability of executive functions and the impact of such training on related cognitive skills has stirred considerable research interest. Despite a number of studies investigating this, the question has not yet been settled. The general aim of this thesis was to investigate two very different types of training of executive functions: laboratory-based computerized training (Studies I-III) and real-world training through bilingualism (Studies IV-V). Bilingualism as a kind of training of executive functions is based on the idea that managing two languages requires executive resources, and previous studies have suggested a bilingual advantage in executive functions. Three executive functions were studied in the present thesis: updating of working memory (WM) contents, inhibition of irrelevant information, and shifting between tasks and mental sets. Studies I-III investigated the effects of computer-based training of WM updating (Study I), inhibition (Study II), and set shifting (Study III) in healthy young adults. All studies showed increased performance on the trained task. More importantly, improvement on an untrained task tapping the trained executive function (near transfer) was seen in Studies I and II. None of the three studies showed improvement on untrained tasks tapping some other cognitive function (far transfer) as a result of training. Study I also used PET to investigate the effects of WM updating training on a neurotransmitter closely linked to WM, namely dopamine. The PET results revealed increased striatal dopamine release during WM updating performance as a result of training. Study IV investigated the ability to inhibit task-irrelevant stimuli in bilinguals and monolinguals by using a dichotic listening task. The results showed that the bilinguals outperformed the monolinguals in inhibiting task-irrelevant information. Study V introduced a new, complementary research approach to studying the bilingual executive advantage and its underlying mechanisms. To circumvent the methodological problems of the natural-groups design, this approach focuses only on bilinguals and examines whether individual differences in bilingual behavior correlate with executive task performance. Using measures tapping the three above-mentioned executive functions, the results suggested that more frequent language switching was associated with better set-shifting skills, and earlier acquisition of the second language with better inhibition skills. In conclusion, the present behavioral results showed that computer-based training of executive functions can improve performance on the trained task and on closely related tasks, but does not yield a more general improvement of cognitive skills. Moreover, the functional neuroimaging results show that WM training modulates striatal dopaminergic function, pointing to training-induced neural plasticity in this important neurotransmitter system. With regard to bilingualism, the results provide further support for the idea that bilingualism can enhance executive functions. In addition, the new complementary research approach proposed here provides some clues as to which aspects of everyday bilingual behavior may be related to the executive-function advantage in bilingual individuals.

Abstract:

The JÄKÄLA algorithm (Jatkuvan Äänitehojakautuman algoritmi Käytävien Äänikenttien LAskentaan, a continuous sound power distribution algorithm for calculating corridor sound fields) and its NUMO and APPRO equations are based on the symmetry of the image sources of a real sound source located in a corridor. NUMO is the equation for the numerical solution of the algorithm and APPRO for its approximate solution. In deriving the algorithm it was assumed that the absorption material is distributed evenly over the sound-reflecting surfaces of the corridor. The transformation of the image-source plane of a rectangular corridor into a continuous sound power distribution involves three steps. First, the rectangular image-source plane is transformed into a square one. Next, the equivalent image sources of the square image-source plane are moved onto a coordinate axis to form a discrete sequence of image sources. Finally, the image-source sequence is converted into a continuous sound power distribution, so that the sound pressure level at a receiving point in the corridor can be calculated by integrating over the continuous sound power distribution. To establish the validity of the JÄKÄLA algorithm, the tested commercial AKURI program was used. The AKURI program also gave a good indication of how the values calculated with the NUMO and APPRO equations might differ from values measured in real corridors. The NUMO and APPRO equations of the JÄKÄLA algorithm were also tested by comparing their results with sound pressure level measurements in three different types of corridor. This study shows that it is possible to derive, on the basis of acoustic image theory, a computational algorithm that can be applied to quick on-site assessment of the sound fields of long corridors. Both the theoretical calculations and the practical sound pressure level measurements in real corridors showed that the prediction accuracy of the JÄKÄLA equations was excellent for ideal corridors and good for those real corridors that had no sound-reflecting structures. The NUMO and APPRO equations appear to work well in corridors whose cross-section is nearly square and in which the largest absorption coefficient of the surfaces is at most ten times the smallest. The main shortcoming of the NUMO and APPRO equations is that they account neither for differing absorption coefficients of the surfaces nor for sound reflected from objects. The NUMO and APPRO equations deviated most from the measured values in corridors where the absorption coefficient of two opposite surfaces was very high and that of the other surface pair very low, and in corridors with massive sound-reflecting pillars and beams. Nevertheless, in the corridors studied the NUMO and APPRO equations of the JÄKÄLA algorithm gave clearly more accurate values than Kuttruff's approximate equation and the basic equation of statistical room acoustics. The computational accuracy of the JÄKÄLA algorithm has been tested in only four real corridors. To develop the algorithm further, opposite surface pairs of a corridor and their absorption coefficients should be treated pairwise in the calculation. To confirm the validity of the algorithm, more measurements are needed in corridors with differing distributions of absorption material.
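
The NUMO and APPRO equations themselves are left to the thesis. As a rough Python sketch of the final step only, integrating a continuous image-source power density to obtain a receiver's sound pressure level, the following assumes a square cross-section of side h, a single uniform absorption coefficient alpha, and a reflection order approximated by s/h; none of these modelling choices are taken from the thesis.

```python
import numpy as np

def corridor_spl(W=1e-3, L=20.0, h=3.0, alpha=0.15, s_max=400.0, n=20000):
    """Sound pressure level (dB) at a receiver L metres down a corridor.

    Illustrative only: the 2-D image-source lattice of a square corridor
    cross-section (side h) is collapsed into a radial power density,
    attenuated by (1 - alpha) per wall reflection, with the reflection
    order approximated as s / h. Not the NUMO/APPRO equations.
    """
    s = np.linspace(0.0, s_max, n)             # radial distance in the image plane
    ds = s[1] - s[0]
    density = 2.0 * np.pi * s / h**2           # image sources per annulus of width ds
    power = W * (1.0 - alpha) ** (s / h)       # power left after ~s/h reflections
    intensity = np.sum(density * power / (4.0 * np.pi * (L**2 + s**2))) * ds
    intensity += W / (4.0 * np.pi * L**2)      # direct sound
    return 10.0 * np.log10(intensity / 1e-12)  # re 10^-12 W/m^2

print(f"SPL 20 m from a 1 mW source: {corridor_spl():.1f} dB")
```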

Abstract:

Communication, the flow of ideas and information between individuals in a social context, is at the heart of the educational experience. Constructivism and constructivist theories form the foundation for the collaborative learning processes of creating and sharing meaning in online educational contexts. The Learning and Collaboration in Technology-enhanced Contexts (LeCoTec) course comprised 66 participants drawn from four European universities (Oulu, Turku, Ghent and Ramon Llull). These participants were split into 15 groups with the express aim of learning about computer-supported collaborative learning (CSCL). The Community of Inquiry model (social, cognitive and teaching presences) provided the content and tools for learning and for researching the collaborative interactions in this environment. The sampled comments from the collaborative phase were collected and analyzed at the chain and group levels, with the aim of identifying the message types that sustained high learning outcomes. Furthermore, social network analysis was used to examine the density of whole-group interactions and to identify the popular and active members within the highly collaborating groups. Long chains were observed to occur in groups with high-quality outcomes. These chains were also characterized by Social, Interactivity, Administrative and Content comment types. In addition, high outcomes were realized in the highly interactive cases and in high-density groups. In groups with low interactivity, commenting centered on one or two central group members. In conclusion, future online environments should support higher-order learning and develop greater metacognition and self-regulation. Moreover, such an environment, with a wide variety of problem-solving tools, would enhance interactivity.
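
The abstract names two network measures without giving details. As a small, hypothetical illustration of both, whole-group density and spotting the popular and active members, here is a sketch with the networkx library on an invented comment graph (the participant names and edges are made up).

```python
import networkx as nx

# Hypothetical reply graph for one group: an edge u -> v means
# participant u commented on a contribution by participant v.
G = nx.DiGraph()
G.add_edges_from([
    ("anna", "ben"), ("ben", "anna"), ("carlos", "anna"),
    ("anna", "carlos"), ("ben", "carlos"), ("diana", "anna"),
])

# Density: fraction of possible directed ties actually present.
print(f"group density: {nx.density(G):.2f}")

# "Active" members initiate many comments (out-degree);
# "popular" members receive many (in-degree). Ties break arbitrarily.
print("most active :", max(G.nodes, key=G.out_degree))
print("most popular:", max(G.nodes, key=G.in_degree))
```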

Abstract:

The determination of the intersection curve between Bézier surfaces may be seen as the composition of two separate problems: determining initial points and tracing the intersection curve from these points. A Bézier surface is represented by a parametric function (a polynomial in two variables) that maps points from the two-dimensional parametric space into three-dimensional space. In this article, an algorithm is proposed to determine the initial points of the intersection curve of Bézier surfaces, based on the solution of polynomial systems with the Projected Polyhedral Method, followed by a method for tracing the intersection curves (a Marching Method with differential equations). The Projected Polyhedral Method requires the equations of the system to be expressed in the Bernstein basis, and to this end a robust and reliable algorithm is proposed to exactly convert a multivariable polynomial from the power basis to the Bernstein basis.
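
The conversion itself is not reproduced in the abstract. For the univariate case, the exact change of basis is the classical identity b_j = sum_{i<=j} [C(j,i)/C(n,i)] a_i; the sketch below applies it with rational arithmetic so the conversion is exact, as the paper requires. The function is an illustration, not the paper's multivariable algorithm (which applies such a conversion variable by variable).

```python
from fractions import Fraction
from math import comb

def power_to_bernstein(a):
    """Exactly convert p(t) = sum_i a[i] * t**i of degree n = len(a) - 1
    into Bernstein coefficients b with
    p(t) = sum_j b[j] * C(n, j) * t**j * (1 - t)**(n - j).

    Uses the classical identity b_j = sum_{i<=j} C(j, i)/C(n, i) * a_i;
    rational arithmetic keeps the conversion exact.
    """
    n = len(a) - 1
    return [
        sum(Fraction(comb(j, i), comb(n, i)) * Fraction(a[i]) for i in range(j + 1))
        for j in range(n + 1)
    ]

# p(t) = 1 - 3t + 2t^2 as a degree-2 Bernstein polynomial
print(power_to_bernstein([1, -3, 2]))  # [Fraction(1, 1), Fraction(-1, 2), Fraction(0, 1)]
```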

Abstract:

The main objective of this work is to analyze the importance of the gas-solid interface transfer of the kinetic energy of turbulent motion for the accuracy of fluid dynamic predictions of Circulating Fluidized Bed (CFB) reactors. CFB reactors are used in a variety of industrial applications related to combustion, incineration and catalytic cracking. In this work a two-dimensional fluid dynamic model for gas-particle flow has been used to compute the porosity, pressure and velocity fields of both phases in 2-D axisymmetric cylindrical coordinates. The fluid dynamic model is based on the two-fluid model approach, in which both phases are considered continuous and fully interpenetrating. CFB processes are essentially turbulent. The effective stress on each phase is modelled as that of a Newtonian fluid, where the effective gas viscosity was calculated from the standard k-epsilon turbulence model and the transport coefficients of the particulate phase were calculated from the kinetic theory of granular flow (KTGF). This work shows that the turbulence transfer between the phases is very important for a better representation of the fluid dynamics of CFB reactors, especially for systems with internal recirculation and high gradients of particle concentration. Two systems with different characteristics were analyzed, and the results were compared with experimental data available in the literature. The results were obtained using a computer code developed by the authors. The finite volume method with a collocated grid, the hybrid interpolation scheme, the false time step strategy and the SIMPLEC (Semi-Implicit Method for Pressure-Linked Equations - Consistent) algorithm were used to obtain the numerical solution.
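
Among the numerical ingredients listed, the hybrid interpolation scheme is compact enough to sketch: for a 1-D control volume it behaves like central differencing at low cell Péclet numbers and like pure upwinding (with diffusion dropped) at high ones. The coefficient formulas below follow the standard Patankar-style presentation and are not taken from the authors' code.

```python
def hybrid_coefficients(F_w, F_e, D_w, D_e):
    """Hybrid-scheme coefficients for one 1-D control volume:
    central differencing for |Pe| < 2, pure upwinding with the
    diffusion term dropped otherwise.

    F = rho*u*A are the convective face fluxes and D = Gamma*A/dx the
    diffusive conductances at the west (w) and east (e) faces.
    """
    a_W = max(F_w, D_w + 0.5 * F_w, 0.0)
    a_E = max(-F_e, D_e - 0.5 * F_e, 0.0)
    a_P = a_W + a_E + (F_e - F_w)  # continuity contribution
    return a_W, a_E, a_P

# Low Peclet number: behaves like central differencing.
print(hybrid_coefficients(F_w=1.0, F_e=1.0, D_w=2.0, D_e=2.0))    # (2.5, 1.5, 4.0)
# High Peclet number: reduces to upwinding, diffusion suppressed.
print(hybrid_coefficients(F_w=10.0, F_e=10.0, D_w=2.0, D_e=2.0))  # (10.0, 0.0, 10.0)
```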

Abstract:

In this paper we present an algorithm for the numerical simulation of cavitation in the hydrodynamic lubrication of journal bearings. Although this physical process is usually modelled as a free boundary problem, we adopted the equivalent variational inequality formulation. We propose a two-level iterative algorithm, where the outer iteration is associated with the penalty method, used to transform the variational inequality into a variational equation, and the inner iteration is associated with the conjugate gradient method, used to solve the linear system generated by applying the finite element method to the variational equation. The inner part was implemented using an element-by-element strategy, which is easily parallelized. We analyze the behavior of two physical parameters and discuss some numerical results. We also analyze some results related to the performance of a parallel implementation of the algorithm.
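
As a hedged illustration of the two-level structure (an outer penalty loop that turns the inequality constraint into an equation, plus an inner conjugate gradient solve), here is a sketch on a 1-D obstacle-type toy problem with finite differences. The paper itself uses finite elements and an element-by-element CG on the lubrication model, so every modelling detail below (grid, penalty parameter, right-hand side) is an assumption.

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import cg

# 1-D obstacle-type toy problem on (0, 1): -u'' = f with u >= 0 and
# u(0) = u(1) = 0, via the penalty formulation
#   A u + (1/eps) * min(u, 0) = f.
# Outer loop: update the penalized (violating) set; inner loop: CG.
n = 200
h = 1.0 / (n + 1)
x = np.linspace(h, 1.0 - h, n)
A = diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n)) / h**2
f = np.where(x < 0.5, -60.0, 40.0)  # pushes the membrane down on the left

eps = 1e-8
u = np.zeros(n)
for outer in range(50):
    active = (u < 0).astype(float)      # generalized derivative of min(u, 0)
    M = A + diags(active / eps)
    u_new, info = cg(M, f)              # inner iteration: conjugate gradients
    if np.allclose(u_new, u):
        break
    u = u_new

# min(u) is ~0: the penalty allows only a tiny constraint violation.
print(f"outer iterations: {outer + 1}, min(u) = {u.min():.2e}")
```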

Abstract:

Crack formation and growth in steel bridge structural elements may be caused by load oscillations. Welded elements are prone to internal discontinuities along welded joints and sensitive to stress variations. Evaluating the remaining life of a bridge is needed to make cost-effective decisions regarding inspection, repair, rehabilitation and replacement. A steel beam model is proposed to simulate crack opening due to cyclic loads. Two alternatives were considered to model crack propagation: an initial phase based on linear fracture mechanics, after which the model is extended to take elastoplastic fracture mechanics concepts into account. Changes in natural frequency are directly related to the variation of the moment of inertia and, consequently, to a reduction in the flexural stiffness of the steel beam. It is therefore possible to adopt a nondestructive technique during steel bridge inspection that quantifies the eigenvalue variation of the structure and uses it to localize the growing crack. A damage detection algorithm is developed for the proposed model, and the numerical results are compared with solutions obtained using another well-known computer code.
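
The frequency-stiffness link the model exploits can be illustrated with the textbook Euler-Bernoulli result for a simply supported beam, omega_n = (n*pi/L)**2 * sqrt(E*I/(rho*A)): any loss of moment of inertia I shows up as a measurable drop in the natural frequencies. The beam data below are invented for illustration, and a uniform reduction of I merely stands in for the local effect of a crack.

```python
import math

def natural_frequency_hz(n, L, E, I, rho, A):
    """n-th bending natural frequency (Hz) of a simply supported
    Euler-Bernoulli beam: omega_n = (n*pi/L)**2 * sqrt(E*I / (rho*A))."""
    omega = (n * math.pi / L) ** 2 * math.sqrt(E * I / (rho * A))
    return omega / (2.0 * math.pi)

# Illustrative steel beam (roughly an IPE 300 profile; invented numbers)
L, E, rho, A = 10.0, 210e9, 7850.0, 5.38e-3  # m, Pa, kg/m^3, m^2
I_intact = 8.36e-5                           # m^4

f_intact = natural_frequency_hz(1, L, E, I_intact, rho, A)
f_cracked = natural_frequency_hz(1, L, E, 0.9 * I_intact, rho, A)  # 10% loss of I
print(f"f1 intact : {f_intact:.2f} Hz")
print(f"f1 cracked: {f_cracked:.2f} Hz  (drop: {100 * (1 - f_cracked / f_intact):.1f}%)")
```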

Abstract:

This work presents the application of a computer package for generating projection data for neutron computerized tomography and, in its second part, discusses an application of neutron tomography, using projection data obtained by the Monte Carlo technique, for the detection and localization of light materials, such as those containing hydrogen, concealed by heavy materials such as iron and lead. Tomographic reconstruction of the simulated samples used only six equally spaced projection angles between 0° and 180°, with an algorithm (ARIEM) based on the principle of maximum entropy. With neutron tomography it was possible to detect and locate polyethylene and water hidden by 1-cm-thick lead and iron. This demonstrates that thermal neutron tomography is a viable test method that can provide important information about the interior of test components, and is thus extremely useful in routine industrial applications.
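
The abstract does not give ARIEM's update rule. A closely related, easy-to-sketch scheme is the multiplicative algebraic reconstruction technique (MART), whose positive multiplicative updates are known, for consistent data, to converge to the maximum-entropy image matching the projections, the same principle ARIEM is based on. The 2 x 2 example below is purely illustrative.

```python
import numpy as np

def mart(A, b, n_iter=50, lam=1.0):
    """Multiplicative ART: x_j <- x_j * (b_i / <a_i, x>)**(lam * A[i, j]).

    Starting from a flat image, the multiplicative updates keep x
    positive and converge (for consistent data) to the maximum-entropy
    image whose projections equal b.
    """
    x = np.ones(A.shape[1])  # flat (maximum-entropy) starting image
    for _ in range(n_iter):
        for i in range(A.shape[0]):
            ratio = b[i] / (A[i] @ x)
            x *= ratio ** (lam * A[i])
    return x

# 2x2 image, rays along rows and columns (binary geometry matrix)
A = np.array([[1., 1., 0., 0.],   # row sums
              [0., 0., 1., 1.],
              [1., 0., 1., 0.],   # column sums
              [0., 1., 0., 1.]])
true_img = np.array([3., 1., 2., 2.])
x = mart(A, A @ true_img)
# Prints the maximum-entropy image with these projections,
# [[2.5, 1.5], [2.5, 1.5]], not necessarily true_img.
print(x.reshape(2, 2).round(3))
```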

Abstract:

Global illumination algorithms are at the center of realistic image synthesis and account for non-trivial light transport and occlusion within scenes, such as indirect illumination, ambient occlusion, and environment lighting. Their computationally most difficult part is determining light source visibility at each visible scene point. Height fields, on the other hand, constitute an important special case of geometry and are mainly used to describe certain types of objects such as terrains and to map detailed geometry onto object surfaces. The geometry of an entire scene can also be approximated by treating the distance values of its camera projection as a screen-space height field. In order to shadow height fields from environment lights, a horizon map is usually used to occlude incident light. We reduce the per-receiver time complexity of generating the horizon map on N × N height fields from the O(N) of previous work to O(1) by using an algorithm that incrementally traverses the height field and reuses the information already gathered along the path of traversal. We also propose an accurate method to integrate the incident light within the limits given by the horizon map. Indirect illumination in height fields requires information about which other points are visible to each height field point. We present an algorithm that determines this intervisibility in a time complexity that matches the space complexity of the produced visibility information, in contrast to previous methods, which scale with the size of the height field. As a result, the amount of computation is reduced by two orders of magnitude in common use cases. Screen-space ambient obscurance methods approximate ambient obscurance from the depth buffer geometry and have been widely adopted by contemporary real-time applications. They work by sampling the screen-space geometry around each receiver point, but have previously been limited to near-field effects because sampling a large radius quickly exceeds the render time budget. We present an algorithm that reduces the quadratic per-pixel complexity of previous methods to a linear complexity by line sweeping over the depth buffer and maintaining an internal representation of the processed geometry from which occluders can be queried efficiently. Another algorithm is presented to determine ambient obscurance from the entire depth buffer at each screen pixel. The algorithm scans the depth buffer in a quick pre-pass and locates important features in it, which are then used to evaluate the ambient obscurance integral accurately. We also propose an evaluation of the integral such that results within a few percent of the ray-traced screen-space reference are obtained at real-time render times.
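
For context on what a horizon map stores, here is a brute-force sketch that computes, for one sweep direction across a 1-D height profile, the horizon angle visible from every sample. Its O(N) cost per receiver is exactly what the thesis reduces to amortized O(1) by incremental traversal; the sketch makes no attempt at that optimization.

```python
import numpy as np

def horizon_angles(h, dx=1.0):
    """Horizon angle (radians above the horizontal) seen by every sample
    of a 1-D height profile, looking in the -x direction.

    Brute force: O(N) work per receiver, O(N^2) total, in contrast to
    the amortized O(1) per receiver achieved by incremental traversal.
    """
    n = len(h)
    angles = np.zeros(n)
    for i in range(1, n):
        # Slope from receiver i to every sample on its left.
        slopes = (h[:i] - h[i]) / (dx * np.arange(i, 0, -1))
        angles[i] = np.arctan(max(slopes.max(), 0.0))
    return angles

profile = np.array([0.0, 2.0, 1.0, 0.5, 3.0, 0.0])
print(np.degrees(horizon_angles(profile)).round(1))
```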

Abstract:

Dental injuries are common, and the incidence of maxillofacial injuries has increased over recent decades in Finland. Accidental injuries are globally the leading cause of death among children over the age of one year and among adults under the age of 40. Significant resources and costs are needed for the treatment of these patients. Prevention is the most economical way to reduce trauma rates and costs, and for prevention it is crucial to know the prevalences, incidences and risk factors related to injuries. To improve the quality of treatment, it is essential to explore the causes, trauma mechanisms and management of trauma. These were the aims of this thesis. A large epidemiological cohort study (5737 participants) made it possible to estimate the lifetime prevalence of and risk factors for dental trauma in the general population (Study I). The prevalence of dental fractures was 43% and the prevalence of dental luxations and avulsions was 14%. Male gender, a history of previous non-dental injuries, mental distress, overweight and high alcohol consumption were positively associated with the occurrence of dental injuries. Study II explored the differences in type and multiplicity of mandibular fractures in three different countries (Canada, Finland and Kuwait). This retrospective study showed that the differences in mandibular fracture multiplicity and location are based on different etiologies and demographic patterns. These data can be exploited for planning measures to prevent traumatic facial fractures. The etiology, management and outcome of 63 pediatric skull base fracture patients (Study III) and 20 pediatric frontobasal fracture patients (Study IV) were explored. These retrospective studies showed that both skull base fractures and frontobasal fractures are rare injuries in childhood and that, although intracranial injuries and morbidity are frequent, permanent neurological or neuropsychological deficits are infrequent. A systematic algorithm (Study V) for computed tomography (CT) image review was aimed at clinicians and radiologists to improve the assessment of patients with complex upper midface and cranial base trauma. The cohort study was cross-sectional, and data were collected in the Turku and Oulu University Hospitals. A novel image-reviewing algorithm was created to enhance the specificity of CT for the diagnosis of frontobasal fractures. The study showed that an image-reviewing algorithm standardizes the frontobasal trauma detection procedure and leads to better control and assessment. The purpose of the retrospective subcranial craniotomy study (Study VI) was to review the types of frontobasal fractures and their management, complications and outcome when the fracture is approached subcranially. The subcranial approach appears to be successful and to have a reasonably low complication rate. It may be recommended as the technique of choice in multiple and highly complicated frontal base fractures where the endoscopic endonasal approach is not feasible.

Abstract:

The use of water-sensitive papers is an important tool for assessing the quality of pesticide application on crops, but manual analysis is laborious and time-consuming. This study therefore aimed to evaluate and compare the results obtained from four software programs for spray droplet analysis on different scanned images of water-sensitive papers. After spraying, papers with four droplet deposition patterns (varying droplet spectra and densities) were analyzed manually and by means of the following computer programs: CIR, e-Sprinkle, DepositScan and Conta-Gotas. The volume median diameter, the number median diameter and the number of droplets per target area were studied. There is a strong correlation between the values measured by the different programs and by manual analysis, but the numerical values measured for the same paper differ greatly. It is therefore not advisable to compare results obtained from different programs.

Abstract:

This thesis concentrates on the validation of the generic thermal hydraulic computer code TRACE against the challenges of the VVER-440 reactor type. The capability of the code to model the VVER-440 geometry and the thermal hydraulic phenomena specific to this reactor design has been examined and shown to be acceptable. The main challenge in VVER-440 thermal hydraulics turned out to be the modelling of the horizontal steam generator, where the major difficulty lies not in the code physics or numerics but in the formulation of a representative nodalization structure. Another VVER-440 peculiarity, the hot leg loop seals, generally challenges the functionality of system codes but proved readily representable. Computer code models have to be validated against experiments to achieve confidence in them. When a new computer code is to be used for nuclear power plant safety analysis, it must first be validated against a large variety of different experiments. The validation process has to cover both the code itself and the code input. Uncertainties of different natures are identified in the different phases of the validation procedure and can even be quantified. This thesis presents a novel approach to input model validation and uncertainty evaluation in the different stages of the computer code validation procedure. It also demonstrates that safety analysis inevitably involves significant uncertainties that are not statistically quantifiable; these need to be, and can be, addressed by other, less simplistic means, ultimately relying on the competence of the analysts and the capability of the community to support the experimental verification of analytical assumptions. This approach essentially complements the commonly used uncertainty assessment methods, which usually rely on statistical methods alone.

Abstract:

This study examined how good an artificial intelligence (AI) for a computer game can be implemented with current knowledge and technology. AI was taken to mean AI-controlled game characters, and simple AI implementations were excluded. The work was carried out by reviewing the literature on the subject and the information available on developer community websites. Entertainment value and believability were chosen as the criteria for good AI. A survey of the most popular implementation techniques and of the possibilities of AI showed that, in theory, even a highly advanced AI can be implemented. In practice, however, the limited resources of the computer, the limited skills of developers, and the demands of game development projects appear to constrain the implementation of AI in a commercial product.

Abstract:

The aim of the present study was to measure full epidermal thickness, stratum corneum thickness, rete length, dermal papilla widening and suprapapillary epidermal thickness in psoriasis patients using light microscopy and computer-supported image analysis. The data obtained were analyzed in terms of patient age, type of psoriasis, total body surface area involvement, scalp and nail involvement, duration of psoriasis, and family history of the disease. The study was conducted on 64 patients and 57 controls whose skin biopsies were examined by light microscopy. The acquired microscopic images were transferred to a computer and measurements were made using image analysis. The skin biopsies, taken from different body areas, were examined for parameters such as epidermal, corneal and suprapapillary epidermal thickness. The most prominent increase in thickness was detected in the palmar region. Corneal thickness was more pronounced in patients with scalp involvement than in patients without it (t = -2.651, P = 0.008). The most prominent increase in rete length was observed in the knees (median: 491 µm, t = 10.117, P < 0.001). The difference in rete length between patients with a positive and a negative family history was significant (t = -3.334, P = 0.03), the length being 27% greater in psoriasis patients without a family history. The differences in dermal papilla distances among patients were very small. We conclude that microscope-supported thickness measurements provide objective results.