996 results for automated meter reading (AMR)


Relevance: 20.00%

Abstract:

Traditional motion capture techniques, for instance those employing optical technology, have long been used in the areas of rehabilitation, sports medicine and performance analysis, where accurately capturing biomechanical data is of crucial importance. However, their size, cost, complexity and lack of portability mean that their use is often impractical. Low-cost MEMS inertial sensors, combined and assembled into a Wireless Inertial Measurement Unit (WIMU), present a possible solution for low-cost and highly portable motion capture. However, owing to the large variability inherent in MEMS sensors, such a system needs extensive characterization to calibrate each sensor and ensure good-quality data capture. A completely calibrated WIMU system would allow motion capture in a wider range of real-world, non-laboratory applications. Calibration can be a complex task, particularly for newer inertial sensors with multiple sensing ranges. We therefore present an automated system for quickly and easily calibrating inertial sensors in a packaged WIMU, and demonstrate some of the improvements in accuracy attainable.
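
The abstract does not describe the calibration procedure itself, so the following is only a minimal sketch, in Python, of one common static calibration step for a single accelerometer axis: estimating per-axis bias and scale factor from averaged readings taken with the axis aligned with and against gravity. All names and values are hypothetical, not the system described above.

```python
# Hypothetical sketch: per-axis bias/scale calibration of a MEMS accelerometer
# from a static test with the axis held against +1 g and then -1 g.
import numpy as np

G = 9.80665  # standard gravity, m/s^2

def calibrate_axis(reading_up, reading_down):
    """Return (bias, scale) for one axis from its +g and -g static readings."""
    bias = (reading_up + reading_down) / 2.0
    scale = (reading_up - reading_down) / (2.0 * G)   # raw counts per m/s^2
    return bias, scale

def apply_calibration(raw, bias, scale):
    """Convert a raw axis reading to a calibrated acceleration in m/s^2."""
    return (raw - bias) / scale

if __name__ == "__main__":
    # Example static averages for one axis (arbitrary ADC-like units).
    up = np.mean([512.3, 511.9, 512.1])
    down = np.mean([-488.2, -487.8, -488.0])
    bias, scale = calibrate_axis(up, down)
    print(bias, scale, apply_calibration(510.0, bias, scale))
```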

Relevance: 20.00%

Abstract:

As a by-product of the 'information revolution' which is currently unfolding, lifetimes of man (and indeed computer) hours are being allocated to the automated and intelligent interpretation of data. This is particularly true in medical and clinical settings, where research into machine-assisted diagnosis of physiological conditions gains momentum daily. Of the conditions which have been addressed, however, automated classification of allergy has not been investigated, even though the number of allergic persons is rising, and undiagnosed allergies are the most likely to prove fatal. On the basis of the observations of allergists who conduct oral food challenges (OFCs), activity-based analyses of allergy tests were performed. Algorithms were investigated and validated in a pilot study, which verified that accelerometer-based measurement of human movement is particularly well suited to the objective appraisal of activity. However, when these analyses were applied to OFCs, accelerometer-based investigations were found to provide very poor separation between allergic and non-allergic persons, and it was concluded that the avenues explored in this thesis are inadequate for the classification of allergy. Heart rate variability (HRV) analysis is known to provide very significant diagnostic information for many conditions. Owing to this, electrocardiograms (ECGs) were recorded during OFCs for the purpose of assessing the effect that allergy induces on HRV features. It was found that, with appropriate analysis, excellent separation between allergic and non-allergic subjects can be obtained. These results were, however, obtained with manual QRS annotations, which are not a viable methodology for real-time diagnostic applications. Even so, this was the first work to categorically correlate changes in HRV features with the onset of allergic events, and the manual annotations give undeniable affirmation of this. Encouraged by the successful results obtained with manual classifications, automatic QRS detection algorithms were investigated to facilitate fully automated classification of allergy. The results obtained by this process are very promising. Most importantly, the work presented in this thesis did not produce any false positive classifications. This is a most desirable result for OFC classification, as it allows complete confidence to be attributed to classifications of allergy. Furthermore, these results could be particularly advantageous in clinical settings, as machine-based classification can detect the onset of allergy and thereby allow early termination of OFCs. Consequently, machine-based monitoring of OFCs has in this work been shown to possess the capacity to significantly and safely advance the current clinical state of the art in allergy diagnosis.
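
As a purely illustrative companion to the HRV analysis described above (not the thesis's actual feature set), the sketch below computes standard time-domain HRV features from a list of R-peak times, such as any QRS detector would produce.

```python
# Standard time-domain HRV features from R-peak times (seconds).
import numpy as np

def hrv_time_domain(r_peak_times_s):
    """Return common time-domain HRV features from R-peak times in seconds."""
    rr = np.diff(r_peak_times_s) * 1000.0             # RR intervals in ms
    sdnn = np.std(rr, ddof=1)                         # overall variability
    rmssd = np.sqrt(np.mean(np.diff(rr) ** 2))        # short-term variability
    pnn50 = np.mean(np.abs(np.diff(rr)) > 50) * 100   # % successive diffs > 50 ms
    return {"SDNN_ms": sdnn, "RMSSD_ms": rmssd, "pNN50_%": pnn50}

if __name__ == "__main__":
    # Hypothetical R-peak times from a short ECG segment.
    peaks = np.cumsum(np.random.normal(0.85, 0.05, 120))
    print(hrv_time_domain(peaks))
```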

Relevance: 20.00%

Abstract:

Ribosome profiling (ribo-seq) is a recently developed technique that provides genome-wide information on protein synthesis (GWIPS) in vivo. The high resolution of ribo-seq is one of its most exciting properties. In Chapter 2, I present a computational method that utilises the sub-codon precision and triplet periodicity of ribosome profiling data to detect transitions in the translated reading frame. Application of this method to ribosome profiling data generated for human HeLa cells allowed us to detect several human genes where the same genomic segment is translated in more than one reading frame. Since the initial publication of the ribosome profiling technique in 2009, there has been a proliferation of studies that have used it to explore various questions relating to translation. A review of the many uses and adaptations of the technique is provided in Chapter 1. Indeed, owing to the increasing popularity of the technique and the growing number of published ribosome profiling datasets, we have developed GWIPS-viz (http://gwips.ucc.ie), a ribo-seq-dedicated genome browser. Details of the development of the browser and its usage are provided in Chapter 3. One of the surprising findings of ribosome profiling of initiating ribosomes, carried out in three independent studies, was the widespread use of non-AUG codons as translation initiation start sites in mammals. Although initiation at non-AUG codons in mammals has been documented for some time, the extent of non-AUG initiation reported by these ribo-seq studies was unexpected. In Chapter 4, I present an approach for estimating the strength of initiating codons based on the leaky scanning model of translation initiation. Application of this approach to ribo-seq data illustrates that initiation at non-AUG codons is inefficient compared to initiation at AUG codons. In addition, our approach provides a probability-of-initiation score for each start site that allows its strength of initiation to be evaluated.
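
The idea behind triplet periodicity can be illustrated with a small sketch: footprint 5' ends accumulate preferentially in one sub-codon phase, and a sustained change in the dominant phase along a transcript hints at a change of translated reading frame. The window size, offsets and input format below are hypothetical and are not the thesis's method.

```python
# Illustrative sub-codon phase analysis of ribosome footprint 5' ends.
import numpy as np

def dominant_phase_per_window(read_5p_positions, window_codons=20):
    """read_5p_positions: footprint 5' ends relative to the annotated CDS start."""
    positions = np.sort(np.asarray(read_5p_positions))
    phases = positions % 3                          # sub-codon phase of each read
    windows = positions // (3 * window_codons)      # window index along the CDS
    result = []
    for w in np.unique(windows):
        counts = np.bincount(phases[windows == w], minlength=3)
        result.append((int(w), int(np.argmax(counts)), counts.tolist()))
    return result   # (window index, dominant phase, per-phase read counts)

if __name__ == "__main__":
    # Toy data: phase 0 dominates early, phase 1 dominates later in the CDS.
    early = np.arange(0, 120, 3)      # multiples of 3 -> phase 0
    late = np.arange(121, 240, 3)     # phase 1
    print(dominant_phase_per_window(np.concatenate([early, late])))
```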

Relevance: 20.00%

Abstract:

The thesis is a historical and philological study of the mature political theory of Miki Kiyoshi (1897-1945) focused on Philosophical Foundations of Cooperative Communitarianism (1939), a full translation of which is included. As the name suggests, it was a methodological and normative communitarianism, which critically built on liberalism, Marxism and Confucianism to realise a regional political community. Some of Miki’s Western readers have wrongly considered him a fascist ideologue, while he has been considered a humanist Marxist in Japan. A closer reading cannot support either view. The thesis argues that the Anglophone study of Japanese philosophy is a degenerating research programme ripe for revolution in the sense of returning full circle to an original point. That means returning to the texts, reading them contextually and philologically, in principle as early modern European political theory is read by intellectual historians, such as the representatives of Cambridge School history of political thought. The resulting reading builds critically on the Japanese scholarship and relates it to contemporary Western and postcolonial political theory and the East Asian tradition, particularly neo-Confucianism. The thesis argues for a Cambridge School perspective radicalised by the critical addendum of geo-cultural context, supplemented by Geertzian intercultural hermeneutics and a Saidian ‘return to philology’. As against those who have seen radical reorientations in Miki’s political thought, the thesis finds gradual progression and continuity between his neo-Kantian, existentialist, Marxian anthropology, Hegelian and finally communitarian phases. The theoretical underpinnings are his philosophical anthropology, a structurationist social theory of praxis, and a critique of liberalism, Marxism, nationalism and idealism emphasising concrete as opposed to abstract theory and the need to build on existing cultural traditions to modernise rather than westernise East Asia. This post-Western fusion was imagined to be the beginning of a true and pluralistic universalism.

Relevance: 20.00%

Abstract:

Quantitative analysis of penetrative deformation in sedimentary rocks of fold and thrust belts has largely been carried out using clast-based strain analysis techniques. These methods analyse the geometric deviations from an original state that populations of clasts, or strain markers, have undergone. Characterising these geometric changes, or strain, in the early stages of rock deformation is not entirely straightforward. This is partly due to the paucity of information on the original state of the strain markers, but also to uncertainty about the relative rheological properties of the strain markers and their matrix during deformation, as well as the interaction of two competing fabrics, such as bedding and cleavage. Furthermore, one of the single largest setbacks for accurate strain analysis has been the methods themselves: they are traditionally time-consuming and labour-intensive, and results can vary between users. A suite of semi-automated techniques has been tested and found to work very well, but in low-strain environments the problems discussed above persist. Additionally, these techniques have been compared with Anisotropy of Magnetic Susceptibility (AMS) analysis, which is a particularly sensitive tool for characterising low strain in sedimentary lithologies.
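
As a simplified, illustrative example of the kind of clast-based quantity such techniques compute (not the thesis's procedure), the sketch below tabulates the axial ratio Rf and long-axis orientation of each elliptical clast; the harmonic mean of Rf is sometimes used as a first-order estimate of the strain ratio when the initial fabric is assumed random. Input values are hypothetical.

```python
# Simplified Rf/phi-style summary for a population of elliptical strain markers.
import numpy as np

def rf_phi_summary(long_axes, short_axes, orientations_deg):
    rf = np.asarray(long_axes, dtype=float) / np.asarray(short_axes, dtype=float)
    phi = np.asarray(orientations_deg, dtype=float)
    return {
        "harmonic_mean_Rf": len(rf) / np.sum(1.0 / rf),   # rough strain-ratio estimate
        "fluctuation_deg": float(phi.max() - phi.min()),  # spread of long-axis orientations
        "n_clasts": int(len(rf)),
    }

if __name__ == "__main__":
    # Hypothetical measurements digitised from a thin section.
    print(rf_phi_summary([2.1, 1.8, 2.4], [1.0, 1.1, 1.2], [12.0, 20.0, 8.0]))
```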

Relevance: 20.00%

Abstract:

This thesis, Reading Lydgate's Troy Book: Patronage, Politics and History in Lancastrian England, discusses the relationship between John Lydgate, as court poet, and his patron Henry V. I contend that the Troy Book serves as a vehicle to propagate the idea that the House of Lancaster is the legitimate successor to King Richard II, in order to smooth over the usurpation of 1399. Paul Strohm's England's Empty Throne was a key influence on the approach taken to this thesis's topic. I examine how, although Chaucer had a definitive impact on Lydgate's writing, Lydgate is able to manipulate this influence for his own ambitions. In order to enhance his own fame, Lydgate works to promote Chaucer's canon so that, as Chaucer's successor, he will inherit more prestige. The Trojan war is seen in the context of the Hundred Years War, and can be applied contextually to political events. Lydgate presents characters that are vulnerable to human failings, and their assorted, complicated relationships. Lydgate modernises the Troy Book to reflect and enhance his Lancastrian society, and the thesis gives a contextual view of Lydgate's writing of the Troy Book. Lydgate writes for a more varied target audience than his thirteenth-century source, Guido delle Colonne, and a discussion of the female characters of the Troy Book supports the theory that Lydgate takes a proactive and empathetic interest in women's roles in society. Furthermore, Lydgate has never really been accepted as a humanist, and I look at his work from a different angle: he is a self-germinating humanist. Lydgate revives antiquity to educate his fifteenth-century audience, and his ambition is to create a memorial for his patron in the vernacular and to enhance his own fame as a poet, separate from Chaucer's shadow.

Relevance: 20.00%

Abstract:

In order to present visual art as a paradigm for philosophy, Merleau-Ponty investigated the creative processes of artists whose work corresponded closely with his philosophical ideas. His essays on art are widely valued for emphasising process over product and for challenging the primacy of the written word in all spheres of human expression. While it is clear that he initially favoured painting, Merleau-Ponty began, in his late work, to develop a much deeper understanding of the complexities of how art is made, in parallel with his advancement of a new ontology. Although his ontology remains unfinished and exists only as working notes and a manuscript entitled The Visible and the Invisible, Merleau-Ponty had begun to appreciate the fundamental role drawing plays in the making of art and in the creation of a language of expression that is as vital as the written or spoken word. Through an examination of Merleau-Ponty's unfinished manuscript and working notes, my thesis investigates his working methods and use of materials, and also explores how he processed his ideas, using my own art practice as the basis of my research. This research takes the form of an inquiry into how the unfinished and incomplete nature of texts and artworks, while they are still 'works in progress', can often reveal the more human and carnal components of creative processes. Applying my experience as a practitioner and a teacher in an art school, I focus on the significance of drawing practice for Merleau-Ponty's later work, in order to rebalance an overemphasis on painting in the literature. Understanding the differences between these two art forms, and how they are taught, can offer an alternative engagement with Merleau-Ponty's later work and his struggle to find a language to express his developing new ontology. In addition, by re-reading his work through the language of drawing, I believe we gain new insights which reaffirm Merleau-Ponty's relevance to contemporary art making and aesthetics.

Relevance: 20.00%

Abstract:

Depletional strategies directed toward achieving tolerance induction in organ transplantation have been associated with an increased incidence and risk of antibody-mediated rejection (AMR) and graft injury. Our clinical data suggest a correlation between increased serum B cell activating factor/survival factor (BAFF) and increased risk of antibody-mediated rejection in alemtuzumab-treated patients. In the present study, we tested the ability of BAFF blockade (TACI-Ig) in a nonhuman primate AMR model to prevent alloantibody production and prolong allograft survival. Three animals received the AMR-inducing regimen (CD3-IT/alefacept/tacrolimus) with TACI-Ig (atacicept), compared to five control animals treated with the AMR-inducing regimen only. TACI-Ig treatment led to decreased levels of donor-specific antibody (DSA) in treated animals at 2 and 4 weeks posttransplantation (p < 0.05). In addition, peripheral B cell numbers were significantly lower at 6 weeks posttransplantation. However, it provided only a marginal increase in graft survival (59 ± 22 vs. 102 ± 47 days; p = 0.11). Histological analysis revealed a substantial reduction in findings typically associated with humoral rejection under atacicept treatment. More T cell rejection findings, with increased graft T cell infiltration, were observed under atacicept treatment, likely secondary to the graft prolongation. We show that BAFF/APRIL blockade using concomitant TACI-Ig treatment reduced the humoral component of rejection in our depletion-induced preclinical AMR model.

Relevance: 20.00%

Abstract:

Four experiments examined participants' ability to produce surface characteristics of sentences in an on-line story reading task. Participants read a series of stories in which either all or the majority of the sentences were written in the same "style," or surface form. Twice per story, participants were asked to fill in a blank consistent with the story. For sentences that contained three stylistic regularities, participants imitated either all three characteristics (Experiment 2) or two of the three characteristics (Experiment 1), depending on the proportion of in-style sentences. Participants demonstrated a recognition bias for the read style in an unannounced recognition task. When participants read stories in which the two styles were the dative/double-object alternation, they demonstrated a syntactic priming effect in the cloze task but no consistent recognition bias in a later recognition test (Experiments 3 and 4).

Relevance: 20.00%

Abstract:

BACKGROUND: Automated reporting of estimated glomerular filtration rate (eGFR) is a recent advance in laboratory information technology (IT) that generates a measure of kidney function alongside chemistry laboratory results to aid early detection of chronic kidney disease (CKD). Because accurate diagnosis of CKD is critical to optimal medical decision-making, several clinical practice guidelines have recommended the use of automated eGFR reporting. Since its introduction, automated eGFR reporting has not been uniformly implemented by U.S. laboratories despite the growing prevalence of CKD. CKD is highly prevalent within the Veterans Health Administration (VHA), and implementation of automated eGFR reporting within this integrated healthcare system has the potential to improve care. In July 2004, the VHA adopted automated eGFR reporting through a system-wide mandate for software implementation by individual VHA laboratories. This study examines the timing of software implementation by individual VHA laboratories and factors associated with implementation. METHODS: We performed a retrospective observational study of laboratories in VHA facilities from July 2004 to September 2009. Using laboratory data, we identified the status of implementation of automated eGFR reporting for each facility and the time to actual implementation from the date the VHA adopted its policy for automated eGFR reporting. Using survey and administrative data, we assessed facility organizational characteristics associated with implementation of automated eGFR reporting via bivariate analyses. RESULTS: Of 104 VHA laboratories, 88% implemented automated eGFR reporting in existing laboratory IT systems by the end of the study period. Time to initial implementation ranged from 0.2 to 4.0 years with a median of 1.8 years. All VHA facilities with on-site dialysis units implemented the eGFR software (52%, p<0.001). Other organizational characteristics were not statistically significant. CONCLUSIONS: The VHA did not have uniform implementation of automated eGFR reporting across its facilities. Facility-level organizational characteristics were not associated with implementation, and this suggests that decisions for implementation of this software are not related to facility-level quality improvement measures. Additional studies on implementation of laboratory IT, such as automated eGFR reporting, could identify factors that are related to more timely implementation and lead to better healthcare delivery.
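
The abstract does not state which equation the VHA software used; as a hedged illustration of what automated eGFR reporting computes, the sketch below implements the 4-variable MDRD Study equation (IDMS-traceable form), which was widely used for automated laboratory reporting in this period.

```python
# 4-variable MDRD Study equation (IDMS-traceable coefficients).
def egfr_mdrd(serum_creatinine_mg_dl, age_years, is_female, is_black):
    """Estimated GFR in mL/min/1.73 m^2."""
    egfr = 175.0 * (serum_creatinine_mg_dl ** -1.154) * (age_years ** -0.203)
    if is_female:
        egfr *= 0.742
    if is_black:
        egfr *= 1.212
    return egfr

if __name__ == "__main__":
    # Example: 65-year-old non-Black woman with serum creatinine 1.1 mg/dL.
    print(round(egfr_mdrd(1.1, 65, True, False), 1))
```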

Relevance: 20.00%

Abstract:

Histopathology is the clinical standard for tissue diagnosis. However, it has several limitations: it requires tissue processing, which can take 30 minutes or more, and it requires a highly trained pathologist to diagnose the tissue. Additionally, the diagnosis is qualitative, and the lack of quantitation leaves it open to observer-specific variation. Taken together, these factors make it difficult to diagnose tissue at the point of care using histopathology.

Several clinical situations could benefit from more rapid and automated histological processing, which could reduce the time and the number of steps required between obtaining a fresh tissue specimen and rendering a diagnosis. For example, there is a need for rapid detection of residual cancer on the surface of tumor resection specimens during excisional surgeries, known as intraoperative tumor margin assessment. Additionally, rapid assessment of biopsy specimens at the point of care could enable clinicians to confirm that a suspicious lesion has been successfully sampled, thus preventing an unnecessary repeat biopsy procedure. Rapid and low-cost histological processing could also be potentially useful in settings lacking the human resources and equipment necessary to perform standard histologic assessment. Lastly, automated interpretation of tissue samples could potentially reduce inter-observer error, particularly in the diagnosis of borderline lesions.

To address these needs, high-quality microscopic images of the tissue must be obtained in rapid timeframes in order for a pathologic assessment to be useful for guiding the intervention. Optical microscopy is a powerful technique for obtaining high-resolution images of tissue morphology in real time at the point of care, without the need for tissue processing. In particular, a number of groups have combined fluorescence microscopy with vital fluorescent stains to visualize micro-anatomical features of thick (i.e. unsectioned or unprocessed) tissue. However, robust methods for segmentation and quantitative analysis of heterogeneous images are essential to enable automated diagnosis. Thus, the goal of this work was to obtain high-resolution images of tissue morphology using fluorescence microscopy and vital fluorescent stains, and to develop a quantitative strategy to segment and quantify tissue features in heterogeneous images, such as nuclei and the surrounding stroma, thereby enabling automated diagnosis of thick tissues.

To achieve these goals, three specific aims were proposed. The first aim was to develop an image processing method that can differentiate nuclei from background tissue heterogeneity and enable automated diagnosis of thick tissue at the point of care. A computational technique called sparse component analysis (SCA) was adapted to isolate features of interest, such as nuclei, from the background. SCA has been used previously in the image processing community for image compression, enhancement, and restoration, but had never been applied to separate distinct tissue types in a heterogeneous image. In combination with a high-resolution fluorescence microendoscope (HRME) and the contrast agent acriflavine, the utility of this technique was demonstrated by imaging preclinical sarcoma tumor margins. Acriflavine localizes to the nuclei of cells, where it reversibly associates with RNA and DNA; it also shows some affinity for collagen and muscle. SCA was adapted to isolate acriflavine-positive features, or APFs (which correspond to RNA and DNA), from background tissue heterogeneity. The circle transform (CT) was applied to the SCA output to quantify the size and density of overlapping APFs. The sensitivity of the SCA+CT approach to variations in APF size, density and background heterogeneity was demonstrated through simulations. Specifically, SCA+CT achieved the lowest errors for higher contrast ratios and larger APF sizes. When applied to tissue images of excised sarcoma margins, SCA+CT correctly isolated APFs and showed consistently increased density in tumor and tumor + muscle images compared to images containing muscle. Next, variables were quantified from images of resected primary sarcomas and used to optimize a multivariate model. The sensitivity and specificity for differentiating positive from negative ex vivo resected tumor margins were 82% and 75%, respectively. The utility of this approach was further tested by imaging the in vivo tumor cavities of 34 mice after resection of a sarcoma, with local recurrence as a benchmark. When applied prospectively to images from the tumor cavity, the sensitivity and specificity for differentiating local recurrence were 78% and 82%, respectively. These results indicate that SCA+CT can accurately delineate APFs in heterogeneous tissue, which is essential to enable automated and rapid surveillance of tissue pathology.
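
As a loose illustration of the quantification step (a stand-in, not the thesis's actual circle transform), the sketch below uses a standard Hough circle search from scikit-image to estimate the number, size and density of round, nucleus-like features in a binary feature map produced by a prior separation step. Radii, thresholds and the toy input are hypothetical.

```python
# Hough-based quantification of round features in a binary map.
import numpy as np
from skimage.feature import canny
from skimage.transform import hough_circle, hough_circle_peaks

def quantify_round_features(binary_map, min_r=3, max_r=12, max_features=20):
    edges = canny(binary_map.astype(float))
    radii = np.arange(min_r, max_r + 1)
    hspaces = hough_circle(edges, radii)
    accums, cx, cy, found_r = hough_circle_peaks(
        hspaces, radii, total_num_peaks=max_features)
    mean_r = float(np.mean(found_r)) if len(found_r) else 0.0
    return {"n_features": int(len(found_r)),
            "mean_radius_px": mean_r,
            "density_per_px": len(found_r) / binary_map.size}

if __name__ == "__main__":
    # Toy binary image containing one bright disk.
    yy, xx = np.mgrid[:64, :64]
    disk = ((yy - 32) ** 2 + (xx - 32) ** 2) < 8 ** 2
    print(quantify_round_features(disk, max_features=3))
```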

Two primary challenges were identified in the work in aim 1. First, while SCA can be used to isolate features, such as APFs, from heterogeneous images, its performance is limited by the contrast between APFs and the background. Second, while it is feasible to create mosaics by scanning a sarcoma tumor bed in a mouse, which is on the order of 3-7 mm in any one dimension, it is not feasible to evaluate an entire human surgical margin. Thus, improvements to the microscopic imaging system were made to (1) improve image contrast through rejecting out-of-focus background fluorescence and to (2) increase the field of view (FOV) while maintaining the sub-cellular resolution needed for delineation of nuclei. To address these challenges, a technique called structured illumination microscopy (SIM) was employed in which the entire FOV is illuminated with a defined spatial pattern rather than scanning a focal spot, such as in confocal microscopy.
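
The abstract does not state how the structured illumination frames are recombined. One classical optical-sectioning reconstruction takes three raw images with the pattern shifted by a third of a period each and demodulates them with a root-sum-of-squared-differences, as in this minimal sketch; it is shown only to make the idea of pattern-based sectioning concrete.

```python
# Minimal optical-sectioning SIM reconstruction from three phase-shifted frames.
import numpy as np

def sim_optical_section(i1, i2, i3):
    """Root-sum-of-squared-differences demodulation of three raw SIM frames
    (grid pattern shifted by 0, 1/3 and 2/3 of a period)."""
    return np.sqrt((i1 - i2) ** 2 + (i2 - i3) ** 2 + (i3 - i1) ** 2)

def widefield_equivalent(i1, i2, i3):
    """Averaging the three raw frames recovers a conventional wide-field image."""
    return (i1 + i2 + i3) / 3.0

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    frames = [rng.random((64, 64)) for _ in range(3)]
    print(sim_optical_section(*frames).shape)
```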

Thus, the second aim was to improve image contrast and increase the FOV by employing wide-field, non-contact structured illumination microscopy, and to optimize the segmentation algorithm for the new imaging modality. Both image contrast and FOV were increased through the development of a wide-field fluorescence SIM system. Clear improvement in image contrast was seen in structured illumination images compared to uniform illumination images. Additionally, the FOV is over 13X larger than that of the fluorescence microendoscope used in aim 1. Initial segmentation results on SIM images revealed that SCA is unable to segment large numbers of APFs in the tumor images. Because the FOV of the SIM system is over 13X larger than the FOV of the fluorescence microendoscope, dense collections of APFs commonly seen in tumor images could no longer be sparsely represented, and the fundamental sparsity assumption of SCA was no longer met. Thus, an algorithm called maximally stable extremal regions (MSER) was investigated as an alternative approach for APF segmentation in SIM images. MSER was able to accurately segment large numbers of APFs in SIM images of tumor tissue. In addition to optimizing MSER for SIM image segmentation, the frequency of the illumination pattern used in SIM was carefully selected, because the image signal-to-noise ratio (SNR) depends on the grid frequency. A grid frequency of 31.7 mm⁻¹ led to the highest SNR and the lowest percent error associated with MSER segmentation.
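
For readers unfamiliar with MSER, the sketch below shows an illustrative use of OpenCV's off-the-shelf MSER detector on a toy grayscale image; it stands in for, and is not, the thesis's tuned segmentation, and the toy image and default parameters are purely hypothetical.

```python
# Illustrative MSER segmentation of bright blob-like features with OpenCV.
import cv2
import numpy as np

def segment_mser(gray_uint8):
    """Return MSER regions (arrays of pixel coordinates) and their bounding boxes."""
    mser = cv2.MSER_create()    # stability/area parameters can be tuned here
    regions, bboxes = mser.detectRegions(gray_uint8)
    return regions, bboxes

if __name__ == "__main__":
    # Toy image: dark background with two brighter blobs.
    img = np.zeros((128, 128), dtype=np.uint8)
    cv2.circle(img, (40, 40), 6, 200, -1)
    cv2.circle(img, (90, 70), 8, 220, -1)
    regions, bboxes = segment_mser(img)
    # Region size in pixels is a crude proxy for feature area.
    print(len(regions), "regions;", [len(r) for r in regions])
```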

Once MSER was optimized for SIM image segmentation and the optimal grid frequency was selected, a quantitative model was developed to diagnose mouse sarcoma tumor margins that were imaged ex vivo with SIM. Tumor margins were stained with acridine orange (AO) in aim 2 because AO was found to stain the sarcoma tissue more brightly than acriflavine. Both acriflavine and AO are intravital dyes, which have been shown to stain nuclei, skeletal muscle, and collagenous stroma. A tissue-type classification model was developed to differentiate localized regions (75x75 µm) of tumor from skeletal muscle and adipose tissue based on the MSER segmentation output. Specifically, a logistic regression model was used to classify each localized region. The logistic regression model yielded an output in terms of the probability (0-100%) that tumor was located within each 75x75 µm region. The model performance was tested using a receiver operating characteristic (ROC) curve analysis, which revealed 77% sensitivity and 81% specificity. For margin classification, the whole margin image was divided into localized regions and this tissue-type classification model was applied. In a subset of 6 margins (3 negative, 3 positive), it was shown that with a tumor probability threshold of 50%, 8% of all regions from negative margins exceeded this threshold, while over 17% of all regions exceeded the threshold in the positive margins. Thus, 8% of regions in negative margins were considered false positives. These false positive regions are likely due to the high density of APFs present in normal tissues, which clearly demonstrates a challenge in implementing this automatic algorithm based on AO staining alone.
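
The sketch below mirrors the kind of model and evaluation described above: a logistic-regression tissue-type classifier over per-region, segmentation-derived variables, assessed with an ROC curve. The feature names and simulated data are hypothetical, not the thesis's measurements.

```python
# Logistic-regression region classifier with ROC evaluation (illustrative data).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_curve, auc

rng = np.random.default_rng(0)
# Hypothetical per-region features: [APF density, mean APF area, area variance]
X_tumor = rng.normal([0.25, 30.0, 12.0], [0.05, 5.0, 3.0], size=(100, 3))
X_normal = rng.normal([0.10, 22.0, 6.0], [0.05, 5.0, 3.0], size=(100, 3))
X = np.vstack([X_tumor, X_normal])
y = np.array([1] * 100 + [0] * 100)       # 1 = tumor, 0 = muscle/adipose

model = LogisticRegression().fit(X, y)
p_tumor = model.predict_proba(X)[:, 1]    # probability of tumor per region

fpr, tpr, thresholds = roc_curve(y, p_tumor)
print("AUC:", auc(fpr, tpr))
print("Regions above 50% tumor probability:", int(np.sum(p_tumor > 0.5)))
```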

Thus, the third aim was to improve the specificity of the diagnostic model by leveraging other sources of contrast. Modifications were made to the SIM system to enable fluorescence imaging at a variety of wavelengths. Specifically, the SIM system was modified to enable imaging of red fluorescent protein (RFP)-expressing sarcomas, which were used to delineate the location of tumor cells within each image. Initial analysis of AO-stained panels confirmed that there was room for improvement in tumor detection, particularly with regard to false positive regions that were negative for RFP. One approach to improving the specificity of the diagnostic model was to investigate a fluorophore that was more specific to tumor. Specifically, tetracycline was selected because it appeared to specifically stain freshly excised tumor tissue in a matter of minutes, and was non-toxic and stable in solution. The results indicate that tetracycline staining holds promise for increasing the specificity of tumor detection in SIM images of a preclinical sarcoma model, and further investigation is warranted.

In conclusion, this work presents the development of a combination of tools that is capable of automated segmentation and quantification of micro-anatomical images of thick tissue. When compared to the fluorescence microendoscope, wide-field multispectral fluorescence SIM imaging provided improved image contrast, a larger FOV with comparable resolution, and the ability to image a variety of fluorophores. MSER was an appropriate and rapid approach to segment dense collections of APFs from wide-field SIM images. Variables that reflect the morphology of the tissue, such as the density, size, and shape of nuclei and nucleoli, can be used to automatically diagnose SIM images. The clinical utility of SIM imaging and MSER segmentation to detect microscopic residual disease has been demonstrated by imaging excised preclinical sarcoma margins. Ultimately, this work demonstrates that fluorescence imaging of tissue micro-anatomy combined with a specialized algorithm for delineation and quantification of features is a means for rapid, non-destructive and automated detection of microscopic disease, which could improve cancer management in a variety of clinical scenarios.

Relevance: 20.00%

Abstract:

The second round of the community-wide initiative Critical Assessment of automated Structure Determination of Proteins by NMR (CASD-NMR-2013) comprised ten blind target datasets, consisting of unprocessed spectral data, assigned chemical shift lists and unassigned NOESY peak and RDC lists, which were made available in both curated (i.e. manually refined) and un-curated (i.e. automatically generated) form. Ten structure calculation programs, using fully automated protocols only, generated a total of 164 three-dimensional structures (entries) for the ten targets, sometimes using both curated and un-curated lists to generate multiple entries for a single target. The accuracy of the entries could be established by comparing them to the corresponding manually solved structure of each target, which was not available at the time the data were provided. Across the entire dataset, 71% of all entries submitted achieved an accuracy relative to the reference NMR structure better than 1.5 Å. Methods based on NOESY peak lists achieved even better results, with up to 100% of the entries within the 1.5 Å threshold for some programs. However, some methods did not converge for some targets when using un-curated NOESY peak lists. Over 90% of the entries achieved an accuracy better than the more relaxed threshold of 2.5 Å that was used in the previous CASD-NMR-2010 round. Comparisons between entries generated with un-curated versus curated peaks show only marginal improvements for the latter in those cases where both calculations converged.
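
The accuracy figures above are distances (in Å) to the reference structure after superposition. As a self-contained illustration of that kind of measure (the CASD-NMR assessment itself used its own, more detailed protocol), the sketch below computes RMSD between two coordinate sets after optimal rigid superposition with the Kabsch algorithm.

```python
# RMSD after optimal rigid-body superposition (Kabsch algorithm).
import numpy as np

def kabsch_rmsd(P, Q):
    """RMSD between two (N, 3) coordinate sets after optimal superposition."""
    P = P - P.mean(axis=0)
    Q = Q - Q.mean(axis=0)
    V, S, Wt = np.linalg.svd(P.T @ Q)
    d = np.sign(np.linalg.det(V @ Wt))        # guard against improper rotation
    R = V @ np.diag([1.0, 1.0, d]) @ Wt       # rotation mapping P onto Q
    diff = P @ R - Q
    return np.sqrt((diff ** 2).sum() / len(P))

if __name__ == "__main__":
    # Toy check: a rotated copy of a coordinate set should give RMSD ~ 0.
    P = np.random.rand(50, 3)
    theta = 0.3
    Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
    print(kabsch_rmsd(P, P @ Rz.T))
```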

Relevance: 20.00%

Abstract:

With the proposal "in my class it's okay to put your foot in it" ("en mi clase se vale 'meter la pata'"), I aim to develop students' mathematical and citizenship competencies through active participation in class. To do so, I start from two premises: (a) error as an opportunity to generate knowledge, and (b) questions as the means of arriving at clear concepts and valid arguments in relation to the mathematical object under study. I develop the proposal from three tasks designed in the teaching unit Razones trigonométricas vistas a través de múltiples lentes (Trigonometric ratios viewed through multiple lenses), which is grounded in the didactical analysis model. The results obtained so far reflect an increase in students' interest in the subject area, in their respect for the ideas of others, and in their use of valid arguments.

Relevance: 20.00%

Abstract:

FUELCON is an expert system for optimized refueling design in nuclear engineering. This task is crucial for keeping down operating costs at a plant without compromising safety. FUELCON proposes sets of alternative configurations for the allocation of fuel assemblies, each positioned in the planar grid of a horizontal section of a reactor core. The results are simulated, and an expert user can also use FUELCON to revise rulesets and improve on his or her heuristics. The successful completion of FUELCON led this research team to undertake a panoply of sequel projects, of which we provide a comparative, meta-architectural formal discussion. In this paper, we demonstrate a novel adaptive technique that learns the optimal allocation heuristic for the various cores. The algorithm is a hybrid of a fine-grained neural network and symbolic computation components. This hybrid architecture is sensitive enough to learn the particular characteristics of the 'in-core fuel management problem' at hand, and powerful enough to use this information fully to revise heuristics automatically, thus improving upon those provided by a human expert.

Relevance: 20.00%

Abstract:

SMARTFIRE is a fire field model based on an open-architecture integrated CFD code and knowledge-based system. It makes use of an expert system to assist the user in setting up the problem specification, and of new computational techniques, such as Group Solvers, to reduce the computational effort involved in solving the equations. This paper concentrates on recent research into the use of artificial intelligence techniques to assist in dynamic solution control of fire scenarios simulated using fire field modelling techniques. This is designed to improve the convergence capabilities of the software while further decreasing the computational overheads. The technique automatically controls solver relaxation using an integrated production rule engine with a blackboard to monitor and implement the required control changes during solution processing. Initial results for a two-dimensional fire simulation are presented which demonstrate the potential for considerable savings in simulation run-times when compared with control sets from various sources. Furthermore, the results demonstrate enhanced solution reliability due to obtaining acceptable convergence within each time step, unlike some of the comparison simulations.
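
Purely as an illustration of the kind of rule-based solution control described above (the actual SMARTFIRE production rules and blackboard are more involved), the sketch below monitors the recent residual history within a time step and nudges an under-relaxation factor up or down accordingly.

```python
# Toy rule-based controller of a solver's under-relaxation factor.
class RelaxationController:
    def __init__(self, relax=0.7, lo=0.1, hi=1.0):
        self.relax, self.lo, self.hi = relax, lo, hi
        self.history = []

    def update(self, residual):
        """Apply simple rules to the residual trend; return the relaxation
        factor to use for the next solver sweep."""
        self.history.append(residual)
        if len(self.history) >= 3:
            r1, r2, r3 = self.history[-3:]
            if r3 > r2 > r1:                  # residuals rising: damp harder
                self.relax = max(self.lo, self.relax * 0.8)
            elif r3 < r2 < r1:                # converging steadily: relax less
                self.relax = min(self.hi, self.relax * 1.05)
        return self.relax

if __name__ == "__main__":
    ctrl = RelaxationController()
    for res in [1.0, 0.8, 0.9, 1.1, 1.3, 0.7, 0.5, 0.4]:
        print(res, "->", round(ctrl.update(res), 3))
```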