894 results for Testing and Debugging
Abstract:
Default ARTMAP combines winner-take-all category node activation during training, distributed activation during testing, and a set of default parameter values that define a ready-to-use, general-purpose neural network system for supervised learning and recognition. Winner-take-all ARTMAP learning is designed so that each input would make a correct prediction if re-presented immediately after its training presentation, passing the "next-input test." Distributed activation has been shown to improve test set prediction on many examples, but an input that made a correct winner-take-all prediction during training could make a different prediction with distributed activation. Default ARTMAP 2 introduces a distributed next-input test during training. On a number of benchmarks, this additional feature of the default system increases accuracy without significantly decreasing code compression. This paper includes a self-contained default ARTMAP 2 algorithm for implementation.
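The next-input test lends itself to a compact sketch. The Python below is a hedged illustration only: ToyPrototypeModel is a made-up nearest-prototype stand-in rather than the default ARTMAP 2 network, and train_with_next_input_test simply shows the idea of immediately re-presenting each training input and checking the distributed prediction against its label.

```python
# Hedged sketch of the "distributed next-input test" idea. The model below is a
# toy stand-in, NOT default ARTMAP 2: it only exposes a winner-take-all training
# hook and a "distributed" prediction hook so the loop structure is visible.

class ToyPrototypeModel:
    """Minimal nearest-prototype stand-in used only to make the sketch runnable."""
    def __init__(self):
        self.prototypes = []                 # list of (vector, label)

    def train_wta(self, x, label):
        # Winner-take-all learning: commit a prototype for this input.
        self.prototypes.append((list(x), label))

    def predict_distributed(self, x):
        # Crude "distributed" prediction: weight all prototypes by similarity.
        scores = {}
        for p, label in self.prototypes:
            sim = 1.0 / (1.0 + sum((a - b) ** 2 for a, b in zip(x, p)))
            scores[label] = scores.get(label, 0.0) + sim
        return max(scores, key=scores.get)

def train_with_next_input_test(model, samples, max_retries=3):
    """After each training presentation, immediately re-present the input and
    check the distributed prediction; reinforce if it disagrees."""
    for x, label in samples:
        model.train_wta(x, label)
        for _ in range(max_retries):
            if model.predict_distributed(x) == label:
                break                        # passes the next-input test
            model.train_wta(x, label)        # otherwise reinforce and re-test
    return model

model = train_with_next_input_test(ToyPrototypeModel(),
                                   [([0.1, 0.9], "A"), ([0.8, 0.2], "B")])
print(model.predict_distributed([0.15, 0.85]))   # expected: "A"
```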
Abstract:
BACKGROUND/AIMS: As genetic and genomic research proliferates, debate has ensued about returning results to participants. In addition to considering the benefits and harms to participants, researchers must also consider the logistical and financial feasibility of returning research results. However, little data exist on actual researcher practices. METHODS: We conducted an online survey of 446 corresponding authors of genetic/genomic studies conducted in the United States and published in 2006-2007 to assess the frequency with which they considered, offered to return, or actually returned research results, what factors influenced these decisions, and the method of communicating results. RESULTS: The response rate was 24% (105/446). Fifty-four percent of respondents considered the issue of returning research results to participants, 28% offered to return individual research results, and 24% actually returned individual research results. Of those who considered the issue of returning research results during the study planning phase, the most common factors considered were whether research results were deemed clinically useful (18%) and respect for participants (13%). Researchers who had a medical degree and conducted studies on children were significantly more likely to offer to return or actually return individual results than those with a Ph.D. only. CONCLUSIONS: We speculate that issues associated with clinical validity and respect for participants outweighed concerns about time and expense, given the prominent and continuing ethical debates surrounding genetics and genomics research. The substantial number of researchers who did not consider returning research results suggests that researchers and institutional review boards need to devote more attention to a topic in which research participants are interested.
Abstract:
In the summers of 1998 and 1999, the Archaeology in Annapolis project carried out archaeological investigation at the eighteenth-century Dr. Upton Scott House site (18AP18), located at 4 Shipwright Street in the historic district of Annapolis, Anne Arundel County, Maryland. The Upton Scott House is significant as one of only a few Georgian houses with remnants of its original plantation-inspired landscape still visible (Graham 1998:147). Investigation was completed in agreement with the owners of the historic property, Mr. and Mrs. Paul Christian, who were interested in determining the condition and arrangement of Dr. Upton Scott’s well-documented pleasure gardens. Betty Cosans’ 1972 Archaeological Feasibility Report, the first real archaeological study of the Upton Scott House site, guided the research design and recovery efforts. Cosans determined that testing and survey in the back and side yards of the Scott property would yield important information on the use and history of the property, including that of Scott’s famous gardens. Excavation units and trenches were placed within three separate areas of backyard activity on the site: Area One, the extant brick stables in the southwest of the property; Area Two, the brick foundations of a small outbuilding in the northwest area of the site; and Area Three, the area of Scott’s formal gardens. The research design included an interest in recovering evidence of African-American spiritual practice and domestic life at the site. Also of significance was an analysis of the order and layout of Scott’s garden beds. The project also sought an understanding of how the various owners of the property changed their perception and use of the backyard over time.
Abstract:
The Maynard-Burgess House was excavated by Archaeology in Annapolis from Fall 1990 to Summer 1992. The still-standing house is located at 163 Duke of Gloucester Street in Annapolis' Historic District and is today being restored by Port of Annapolis, Incorporated. Archaeological testing and excavation of the site was developed alongside architectural analyses and archival research as the initial phase of the home's restoration. The Maynard-Burgess House was continuously occupied by two African-American families, the Maynards and the Burgesses, from the 1850s until the late 1980s. The main block of the house was built between 1850 and 1858 by the household of John T. Maynard, a free African American born in 1810, and his wife Maria Spencer Maynard. Maynard descendants lived in the home until it was foreclosed in 1908 and subsequently sold to the family of Willis and Mary Burgess in 1915. Willis had been a boarder in the home in 1880, and his sister Martha Ready had married John and Maria's son John Henry. Burgess descendants lived at the home until its sale in 1990.
Abstract:
Currently, no available pathological or molecular measures of tumor angiogenesis predict response to antiangiogenic therapies used in clinical practice. Recognizing that tumor endothelial cells (EC) and EC activation and survival signaling are the direct targets of these therapies, we sought to develop an automated platform for quantifying activity of critical signaling pathways and other biological events in EC of patient tumors by histopathology. Computer image analysis of EC in highly heterogeneous human tumors by a statistical classifier trained using examples selected by human experts performed poorly due to subjectivity and selection bias. We hypothesized that the analysis could be optimized by a more active process to aid experts in identifying informative training examples. To test this hypothesis, we incorporated a novel active learning (AL) algorithm into the FARSIGHT image analysis software that aids the expert by seeking out informative examples for the operator to label. The resulting FARSIGHT-AL system identified EC with specificity and sensitivity consistently greater than 0.9 and outperformed traditional supervised classification algorithms. The system modeled individual operator preferences and generated reproducible results. Using the results of EC classification, we also quantified proliferation (Ki67) and activity in important signal transduction pathways (MAP kinase, STAT3) in immunostained human clear cell renal cell carcinoma and other tumors. FARSIGHT-AL enables characterization of EC in conventionally preserved human tumors in a more automated process suitable for testing and validation in clinical trials. The results of our study support a unique opportunity for quantifying angiogenesis in a manner that can now be tested for its ability to identify novel predictive and response biomarkers.
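The query loop the abstract describes, seeking out the examples the operator should label next, can be illustrated with generic pool-based uncertainty sampling. This is not the FARSIGHT-AL algorithm; the synthetic features, classifier choice, and query budget below are assumptions made only for the sketch.

```python
# Hedged sketch of pool-based active learning with uncertainty sampling.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic feature pool standing in for candidate endothelial-cell objects.
X_pool = rng.normal(size=(500, 5))
y_true = (X_pool[:, 0] + 0.5 * X_pool[:, 1] > 0).astype(int)   # hidden "expert" labels

# Seed the training set with a few labelled examples from each class.
labeled = list(np.where(y_true == 0)[0][:5]) + list(np.where(y_true == 1)[0][:5])
clf = LogisticRegression()

for _ in range(15):                                   # 15 expert queries
    clf.fit(X_pool[labeled], y_true[labeled])
    proba = clf.predict_proba(X_pool)[:, 1]
    uncertainty = np.abs(proba - 0.5)                 # closest to 0.5 = most informative
    uncertainty[labeled] = np.inf                     # never re-query labelled items
    query = int(np.argmin(uncertainty))
    labeled.append(query)                             # operator supplies y_true[query]

print(f"labelled examples: {len(labeled)}, pool accuracy: {clf.score(X_pool, y_true):.2f}")
```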
Abstract:
Software-based control of life-critical embedded systems has become increasingly complex and, to a large extent, has come to determine the safety of the humans who depend on them. For example, implantable cardiac pacemakers contain over 80,000 lines of code responsible for keeping the heart within safe operating limits. As firmware-related recalls accounted for over 41% of the 600,000 devices recalled in the last decade, there is a need for rigorous model-driven design tools to generate verified code from verified software models. To this effect, we have developed the UPP2SF model-translation tool, which facilitates automatic conversion of verified models (in UPPAAL) to models that may be simulated and tested (in Simulink/Stateflow). We describe the translation rules that ensure correct model conversion, applicable to a large class of models. We demonstrate how UPP2SF is used in the model-driven design of a pacemaker whose model is (a) designed and verified in UPPAAL (using timed automata), (b) automatically translated to Stateflow for simulation-based testing, and (c) used to automatically generate modular code for hardware-level integration testing of timing-related errors. In addition, we show how UPP2SF may be used for worst-case execution time estimation early in the design stage. Using UPP2SF, we demonstrate the value of an integrated end-to-end modeling, verification, code-generation, and testing process for complex software-controlled embedded systems. © 2014 ACM.
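Below is a hedged sketch of the kind of model-to-model mapping the abstract describes: a tiny timed automaton is mechanically translated into a statechart-like structure. The dict-based representation, the pacemaker-flavoured location names, and the output format are illustrative stand-ins, not UPPAAL's model format, the actual UPP2SF translation rules, or the Stateflow API.

```python
# Hedged sketch of model-to-model translation in the spirit of UPP2SF: a toy
# timed automaton (locations, clock guards, resets) is mapped to a chart-like
# description of states and transitions. None of these structures reproduce
# the real UPPAAL or Stateflow formats.
import pprint

timed_automaton = {
    "clocks": ["t"],
    "locations": {
        "WaitVP": {"invariant": "t <= LRI"},   # must pace before the LRI timer expires
        "PaceV":  {"invariant": "t <= 0"},     # urgent location: pace immediately
    },
    "transitions": [
        {"src": "WaitVP", "dst": "PaceV",  "guard": "t >= LRI", "reset": ["t"]},
        {"src": "PaceV",  "dst": "WaitVP", "guard": "",         "reset": []},
    ],
}

def to_chart(ta):
    """Map each location to a state and each guarded edge to a transition,
    carrying clock guards and resets across as conditions and actions."""
    chart = {"states": {}, "transitions": []}
    for name, loc in ta["locations"].items():
        chart["states"][name] = {"invariant": loc["invariant"]}
    for tr in ta["transitions"]:
        chart["transitions"].append({
            "from": tr["src"],
            "to": tr["dst"],
            "condition": tr["guard"] or "true",
            "action": "; ".join(f"{c} = 0" for c in tr["reset"]),
        })
    return chart

pprint.pprint(to_chart(timed_automaton))
```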
Abstract:
The Computer Aided Parallelisation Tools (CAPTools) [Ierotheou C, Johnson SP, Cross M, Leggett PF, Computer aided parallelisation tools (CAPTools) - conceptual overview and performance on the parallelisation of structured mesh codes, Parallel Computing, 1996;22:163–195] is a set of interactive tools aimed at providing automatic parallelisation of serial FORTRAN Computational Mechanics (CM) programs. CAPTools analyses the user's serial code and then, through stages of array partitioning, mask and communication calculation, generates parallel SPMD (Single Program Multiple Data) message-passing FORTRAN. The parallel code generated by CAPTools contains calls to a collection of routines that form the CAPTools Communications Library (CAPLib). The library provides a portable layer and user-friendly abstraction over the underlying parallel environment. CAPLib contains optimised message-passing routines for data exchange between parallel processes and other utility routines for parallel execution control, initialisation and debugging. By compiling and linking with different implementations of the library, the user is able to run on many different parallel environments. Even with today's parallel systems the concept of a single version of a parallel application code is more of an aspiration than a reality. However, for CM codes the data-partitioning SPMD paradigm requires a relatively small set of message-passing communication calls. This set can be implemented as an intermediate `thin layer' library of message-passing calls that enables the parallel code (especially that generated automatically by a parallelisation tool such as CAPTools) to be as generic as possible. CAPLib is just such a `thin layer' message-passing library that supports parallel CM codes, by mapping generic calls onto machine-specific libraries (such as CRAY SHMEM) and portable general-purpose libraries (such as PVM and MPI). This paper describes CAPLib together with its three perceived advantages over other routes: as a high-level abstraction, it is both easy to understand (especially when generated automatically by tools) and to implement by hand for the CM community (who are not generally parallel computing specialists); the one parallel version of the application code is truly generic and portable; and the parallel application can readily utilise whatever message-passing libraries on a given machine yield optimum performance.
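The "thin layer" idea, a single set of generic communication calls mapped onto whichever underlying library is available, can be sketched as follows. This is a conceptual Python illustration, not the CAPLib Fortran interface: the call names and the dummy in-process backend are assumptions, and a real backend would delegate to MPI (for example via mpi4py), PVM, or SHMEM behind the same two methods.

```python
# Hedged sketch of a 'thin layer' message-passing abstraction: generated code
# calls cap_send/cap_receive, and the backend object decides which underlying
# library actually moves the data. Names are illustrative, not CAPLib's API.

class InProcessBackend:
    """Dummy backend so the sketch runs with no MPI installed."""
    def __init__(self):
        self.mailboxes = {}                      # (src, dest) -> queued messages

    def send(self, data, src, dest):
        self.mailboxes.setdefault((src, dest), []).append(data)

    def recv(self, src, dest):
        return self.mailboxes[(src, dest)].pop(0)

class ThinLayer:
    """The generic calls a generated SPMD code would use; swapping the backend
    changes the underlying library without touching the generated code."""
    def __init__(self, backend, rank):
        self.backend, self.rank = backend, rank

    def cap_send(self, array, dest):
        self.backend.send(array, self.rank, dest)

    def cap_receive(self, source):
        return self.backend.recv(source, self.rank)

backend = InProcessBackend()
p0, p1 = ThinLayer(backend, 0), ThinLayer(backend, 1)
p0.cap_send([1.0, 2.0, 3.0], dest=1)             # e.g. a halo row of a partitioned array
print(p1.cap_receive(source=0))                  # -> [1.0, 2.0, 3.0]
```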
Abstract:
Predicting the reliability of newly designed products, before manufacture, is obviously highly desirable for many organisations. Understanding the impact of various design variables on reliability allows companies to optimise expenditure and release a package in minimum time. Reliability predictions originated in the early years of the electronics industry. These predictions were based on historical field data, which have evolved into industrial databases and specifications such as the famous MIL-HDBK-217 standard, plus numerous others. Unfortunately, the accuracy of such techniques is highly questionable, especially for newly designed packages. This paper discusses the use of modelling to predict the reliability of high-density flip-chip and BGA components. A number of design parameters are investigated at the assembly stage, during testing, and in service.
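For readers unfamiliar with the historical-data approach the paper questions, the sketch below shows a MIL-HDBK-217-style parts-count estimate in outline. The base failure rates and pi-factors are placeholder values invented for the example, not figures from the standard or from the paper.

```python
# Hedged sketch of a parts-count reliability estimate of the general form
#   lambda_part = lambda_base * pi_quality * pi_environment   (failures per 1e6 h)
# All numeric values below are made-up placeholders for illustration only.

parts = [
    {"name": "BGA package",    "lambda_base": 0.020,      "pi_q": 1.0, "pi_e": 4.0},
    {"name": "flip-chip die",  "lambda_base": 0.015,      "pi_q": 2.0, "pi_e": 4.0},
    {"name": "passives (x40)", "lambda_base": 0.001 * 40, "pi_q": 1.0, "pi_e": 4.0},
]

lam_total = sum(p["lambda_base"] * p["pi_q"] * p["pi_e"] for p in parts)
mtbf_hours = 1e6 / lam_total      # convert failures per 1e6 h into mean time between failures
print(f"predicted failure rate: {lam_total:.3f} per 1e6 h, MTBF = {mtbf_hours:,.0f} h")
```

The point of the illustration is that such estimates depend entirely on tabulated historical factors, which is precisely why their accuracy for newly designed packages is questioned in the abstract.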
Abstract:
In this paper a methodology for the application of computer simulation to the evacuation certification of aircraft is suggested. The methodology suggested here involves the use of computer simulation, historic certification data, component testing and full-scale certification trials. The proposed methodology sets out a protocol for how computer simulation should be undertaken in a certification environment and draws on experience from both the marine and building industries. Along with the suggested protocol, a phased introduction of computer models to certification is suggested. Given the sceptical nature of the aviation community regarding any change to certification methodology, this would involve as a first step the use of computer simulation in conjunction with full-scale testing. The computer model would be used to reproduce a probability distribution of likely aircraft performance under current certification conditions and, in addition, several other more challenging scenarios could be developed. The combination of full-scale trial, computer simulation (and, if necessary, component testing) would provide better insight into the actual performance capabilities of the aircraft by generating a performance probability distribution or performance envelope rather than a single datum. Once further confidence in the technique is established, the second step would involve only computer simulation and component testing. This would only be contemplated after sufficient experience and confidence in the use of computer models have been developed. The third step in the adoption of computer simulation for certification would involve the introduction of several scenarios based on, for example, exit availability informed by accident analysis. The final step would be the introduction of more realistic accident scenarios into the certification process. This would require the continued development of aircraft evacuation modelling technology to include additional behavioural features common in real accident scenarios.
Proposed methodology for the use of computer simulation to enhance aircraft evacuation certification
Abstract:
In this paper a methodology for the application of computer simulation to the evacuation certification of aircraft is suggested. This involves the use of computer simulation, historic certification data, component testing, and full-scale certification trials. The methodology sets out a framework for how computer simulation should be undertaken in a certification environment and draws on experience from both the marine and building industries. In addition, a phased introduction of computer models to certification is suggested. This involves as a first step the use of computer simulation in conjunction with full-scale testing. The combination of full-scale trial, computer simulation (and, if necessary, component testing) provides better insight into aircraft evacuation performance capabilities by generating a performance probability distribution rather than a single datum. Once further confidence in the technique is established, the requirement for the full-scale demonstration could be dropped. The second step in the adoption of computer simulation for certification involves the introduction of several scenarios based on, for example, exit availability informed by accident analysis. The final step would be the introduction of more realistic accident scenarios. This would require the continued development of aircraft evacuation modeling technology to include additional behavioral features common in real accident scenarios.
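To make the contrast between a performance probability distribution and a single datum concrete, the toy Monte Carlo below samples simulated evacuation times and reports an exceedance probability. The lognormal parameters and the 90-second threshold are illustrative assumptions, not certification data or output from any validated evacuation model.

```python
# Hedged sketch: why a distribution of simulated egress times is more
# informative than one full-scale trial result. All numbers are assumptions.
import numpy as np

rng = np.random.default_rng(1)
n_runs = 5000

# Stand-in for repeated evacuation simulations: one total egress time (s) per run.
egress_times = rng.lognormal(mean=np.log(75), sigma=0.12, size=n_runs)

limit_s = 90.0                                   # assumed pass/fail threshold
p_exceed = float(np.mean(egress_times > limit_s))
p95 = float(np.percentile(egress_times, 95))

print(f"median egress time : {np.median(egress_times):.1f} s")
print(f"95th percentile    : {p95:.1f} s")
print(f"P(time > {limit_s:.0f} s)    : {p_exceed:.3f}")
# A single live trial samples just one point from this distribution, which is
# why the abstract argues for a performance envelope rather than a single datum.
```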
Abstract:
This paper describes work towards the deployment of flexible self-management into real-time embedded systems. A challenging project focusing specifically on the development of a dynamic, adaptive automotive middleware is described, and the specific self-management requirements of this project are discussed. These requirements have been identified through the refinement of a wide-ranging set of use cases requiring context-sensitive behaviours. A sample of these use cases is presented to illustrate the extent of the demands for self-management. The strategy that has been adopted to achieve self-management, based on the use of policies, is presented. The embedded and real-time nature of the target system brings the constraints that dynamic adaptation capabilities must not require changes to the run-time code (except during hot update of complete binary modules), that adaptation decisions must have low latency, and, because the target platforms are resource-constrained, that the self-management mechanism must have low resource requirements (especially in terms of processing and memory). Policy-based computing is thus an ideal candidate for achieving self-management, because the policy itself is loaded at run-time and can be replaced or changed in the future in the same way that a data file is loaded. Policies represent a relatively low-complexity and low-risk means of achieving self-management, with low run-time costs. Policies can be stored internally in ROM (such as default policies) as well as externally to the system. The architecture of a designed-for-purpose, powerful yet lightweight policy library is described. A suitable evaluation platform, supporting the whole life-cycle of feasibility analysis, concept evaluation, development, rigorous testing and behavioural validation, has been devised and is described.
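The policy-as-data idea can be sketched briefly: the policy is loaded at run time like a configuration file, so behaviour can change without touching the compiled code. The rule format, signal names, and actions below are assumptions for the illustration, not the project's actual policy library.

```python
# Hedged sketch of policy-based adaptation: rules are plain data that can be
# swapped at run time (from ROM, flash, or an external source); evaluation is
# a cheap linear scan, keeping latency and memory use low.
import json

policy_text = """
[
  {"if": {"signal": "cpu_load",   "above": 0.90}, "then": "shed_low_priority_tasks"},
  {"if": {"signal": "bus_errors", "above": 5},    "then": "switch_to_backup_bus"}
]
"""

def evaluate(policy, context):
    """Return the actions whose conditions hold for the current context."""
    actions = []
    for rule in policy:
        cond = rule["if"]
        if context.get(cond["signal"], 0) > cond["above"]:
            actions.append(rule["then"])
    return actions

policy = json.loads(policy_text)              # loaded like a data file, not compiled in
print(evaluate(policy, {"cpu_load": 0.95, "bus_errors": 2}))
# -> ['shed_low_priority_tasks']
```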
Abstract:
Purpose – This paper aims to present an open-ended microwave curing system for microelectronics components and a numerical analysis framework for virtual testing and prototyping of the system, enabling the design of physical prototypes to be optimised and expediting the development process. Design/methodology/approach – An open-ended microwave oven system able to enhance the cure process for thermosetting polymer materials utilised in microelectronics applications is presented. The system is designed to be mounted on a precision placement machine, enabling curing of individual components on a circuit board. The design of the system allows the heating pattern and heating rate to be carefully controlled, optimising cure rate and cure quality. A multi-physics analysis approach has been adopted to form a numerical model capable of capturing the complex coupling that exists between the physical processes. Electromagnetic analysis has been performed using a Yee finite-difference time-domain scheme, while an unstructured finite volume method has been utilised to perform the thermophysical analysis. The two solvers are coupled using a sampling-based cross-mapping algorithm. Findings – The numerical results obtained demonstrate that the numerical model is able to obtain solutions for the distribution of temperature, rate of cure, degree of cure and thermally induced stresses within an idealised polymer load heated by the proposed microwave system. Research limitations/implications – The work is limited by the absence of experimentally derived material property data and comparative experimental results. However, the model demonstrates that the proposed microwave system would seem to be a feasible method of expediting the cure rate of polymer materials. Originality/value – The findings of this paper will help to provide an understanding of the behaviour of thermosetting polymer materials during microwave cure processing.
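As a rough illustration of the electromagnetic half of the coupled model, the sketch below runs a one-dimensional Yee FDTD update loop. The grid size, material constants, and source are assumptions chosen only to make the example self-contained; the solver described in the paper is three-dimensional and coupled to an unstructured finite volume thermal code.

```python
# Hedged sketch of a 1-D Yee FDTD update loop with a dielectric "load" region.
# All parameters are illustrative assumptions, not values from the paper.
import numpy as np

nx, nsteps = 200, 400
c0 = 299_792_458.0
dx = 1e-3                         # 1 mm cells
dt = 0.5 * dx / c0                # Courant-stable time step

Ez = np.zeros(nx)                 # electric field, staggered with Hy
Hy = np.zeros(nx - 1)
eps_r = np.ones(nx)
eps_r[120:160] = 4.0              # dielectric load standing in for the polymer

mu0, eps0 = 4e-7 * np.pi, 8.854e-12
for n in range(nsteps):
    Hy += (dt / (mu0 * dx)) * (Ez[1:] - Ez[:-1])                       # update H from curl E
    Ez[1:-1] += (dt / (eps0 * eps_r[1:-1] * dx)) * (Hy[1:] - Hy[:-1])  # update E from curl H
    Ez[20] += np.sin(2 * np.pi * 2.45e9 * n * dt)                      # soft 2.45 GHz source

print("peak |Ez| in load region:", float(np.abs(Ez[120:160]).max()))
```

In the full model, field solutions of this kind supply the dissipated-power source term that the thermal and cure-kinetics solver integrates, with the cross-mapping step transferring data between the two meshes.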
Abstract:
A physically open, but electrically shielded, microwave open oven can be produced by virtue of the evanescent fields in a waveguide below cutoff. The below-cutoff heating chamber is fed by a transverse magnetic resonance established in a dielectric-filled section of the waveguide, exploiting continuity of normal electric flux. In order to optimize the fields and the performance of the oven, a thin layer of a dielectric material with higher permittivity is inserted at the interface. Analysis and synthesis of an optimized open oven predict field enhancement in the heating chamber of up to 9.4 dB. Results from experimental testing on two fabricated prototypes are in agreement with the simulated predictions and demonstrate up to a tenfold improvement in heating performance. The open-ended oven allows for simultaneous precision alignment, testing, and efficient curing of microelectronic devices, significantly increasing productivity.
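A quick hedged check links the two figures quoted above: read as a power-equivalent ratio, 9.4 dB corresponds to roughly an 8.7-fold increase in dissipated power, which is of the same order as the roughly tenfold measured improvement in heating performance.

```python
# Hedged back-of-envelope check, assuming the dB figure is read as a
# power-equivalent ratio (heating scales with dissipated power).
enhancement_db = 9.4
power_ratio = 10 ** (enhancement_db / 10)
print(f"{enhancement_db} dB -> x{power_ratio:.1f} in dissipated power")   # ~8.7x
```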
Abstract:
Aims: To determine whether routine outpatient monitoring of growth predicts adrenal suppression in prepubertal children treated with high dose inhaled glucocorticoid.
Methods: Observational study of 35 prepubertal children (aged 4–10 years) treated with at least 1000 µg/day of inhaled budesonide or a glucocorticoid of equivalent potency for at least six months. Main outcome measures were: changes in HtSDS over the 6 and 12 month periods preceding adrenal function testing, and increment and peak cortisol after stimulation by a low dose tetracosactrin test. Adrenal suppression was defined as a peak cortisol <500 nmol/l.
Results: The areas under the receiver operating characteristic curves for a decrease in HtSDS as a predictor of adrenal insufficiency 6 and 12 months prior to adrenal testing were 0.50 (SE 0.10) and 0.59 (SE 0.10). Prediction values of an HtSDS change of –0.5 for adrenal insufficiency at 12 months prior to testing were: sensitivity 13%, specificity 95%, and a positive likelihood ratio of 2.4. Peak cortisol reached correlated poorly with change in HtSDS (r = 0.23, p = 0.19 at 6 months; r = 0.33, p = 0.06 at 12 months).
Conclusions: Monitoring growth does not enable prediction of which children treated with high dose inhaled glucocorticoids are at risk of potentially serious adrenal suppression. Both growth and adrenal function should be monitored in patients on high dose inhaled glucocorticoids. Further research is required to determine the optimal frequency of monitoring adrenal function.
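A quick check of the positive likelihood ratio quoted in the Results above: LR+ = sensitivity / (1 - specificity). With the rounded values reported (13% and 95%) this gives about 2.6, in line with the paper's figure of 2.4 once unrounded sensitivity and specificity are used.

```python
# Hedged check of the reported positive likelihood ratio from the rounded
# sensitivity and specificity quoted in the abstract.
sensitivity, specificity = 0.13, 0.95
lr_positive = sensitivity / (1 - specificity)
print(f"LR+ = {lr_positive:.1f}")   # ~2.6 from rounded inputs; paper reports 2.4
```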
Abstract:
Developing appropriate treatments for easel paintings can be complex, as many works are composed of various materials that respond in different ways. When selecting a filling material for these artworks, several properties are investigated, including: the need for the infill to react to environmental conditions in a manner similar to the original material; the need for the infill to have good handling properties, adhesion to the original support, and cohesion within the filling material; the ability of the infill to withstand the stress of the surrounding material; and the need for it to be as flexible as the original material so as not to cause further damage. In addition, changes in colour or mechanical properties should not occur as part of the ageing process. Studies are needed on acrylic-based materials used as infills in conservation treatments. This research examines some of the chemical, physical, and optical changes of eleven filling materials before and after ageing, with the aim of evaluating the overall appropriateness of these materials as infills for easel paintings. The materials examined were three rabbit skin glue (RSG) gessoes and seven commercially prepared acrylic materials, all easily acquired in North America. Chemical analysis was carried out with Fourier transform infrared (FTIR) spectroscopy, X-ray fluorescence (XRF), pyrolysis gas chromatography-mass spectrometry (Py-GC/MS), and differential scanning calorimetry (DSC). Overall, the compositions of the various materials examined were found to be in agreement with the available literature and previous research. This study also examined characteristics of these materials not described in previous works and, additionally, presented the compositions and behaviour of several commonly used materials with little description in the literature. After application of an ageing regimen, most naturally aged and artificially aged samples displayed small changes in gloss, colour, thickness, and diffusive behaviour; however, to evaluate these materials fully, mechanical testing and environmental studies should be carried out.