959 results for Monitor Command System (Computer program)


Relevance:

100.00%

Publisher:

Abstract:

The research concerns the development and application of an analytical computer program, SAFE-ROC, that models the material and structural behaviour of a slender reinforced concrete column that forms part of an overall structure and is subjected to elevated temperatures as a result of exposure to fire. The analysis approach used in SAFE-ROC is non-linear. The computer calculations take account of restraint and continuity, and of the interaction of the column with the surrounding structure during the fire. Within a given time step an iterative approach is used to find a deformed shape for the column that results in equilibrium between the forces associated with the external loads and the internal stresses, allowing for material degradation. Non-linear geometric effects are taken into account by updating the geometry of the structure as it deforms. SAFE-ROC includes a total strain model that enforces compatibility of the strains due to temperature and loading; the total strain model represents the constitutive law governing the material behaviour of the concrete and steel. The material behaviour models employed for concrete and steel take account of the dimensional changes caused by temperature differentials and of the changes in the material mechanical properties with temperature. Non-linear stress-strain laws allow for loading to a strain greater than that corresponding to the peak stress of the concrete stress-strain relation, and model the inelastic deformation associated with unloading of the steel stress-strain relation. The cross-section temperatures caused by the fire environment are obtained by a preceding non-linear thermal analysis performed with the computer program FIRES-T.
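The time-stepping equilibrium iteration described above can be illustrated with a minimal sketch in Python; all names and the toy softening stiffness are assumptions for illustration, not SAFE-ROC's actual formulation.

```python
# A minimal sketch of a per-time-step equilibrium iteration: adjust the
# deformed shape until internal forces balance the external loads.
import numpy as np

def solve_time_step(u, external_load, secant_stiffness,
                    tol=1e-8, max_iter=100):
    """Iterate the deformed shape u until internal forces balance loads."""
    for _ in range(max_iter):
        internal = secant_stiffness(u) @ u          # internal nodal forces
        residual = external_load - internal
        if np.linalg.norm(residual) < tol:
            return u  # equilibrium between external loads and stresses
        # Correct the deformed shape; a real program would also update
        # the geometry here to capture non-linear geometric effects.
        u = u + np.linalg.solve(secant_stiffness(u), residual)
    raise RuntimeError("no equilibrium found within max_iter iterations")

# Toy usage: a two-DOF column with a stiffness that softens as it deforms
k0 = 100.0 * np.eye(2)
soften = lambda u: k0 / (1.0 + np.linalg.norm(u))
u_eq = solve_time_step(np.zeros(2), np.array([5.0, 2.0]), soften)
```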

Relevance:

100.00%

Publisher:

Abstract:

This thesis encompasses an investigation of the behaviour of a concrete frame structure under localised fire scenarios, carried out by implementing a constitutive model in a finite-element computer program. The investigation covered the properties of the materials at elevated temperature, a description of the computer program, and the thermal and structural analyses. Transient thermal properties of the materials were employed in this study to achieve reasonable results. The finite-element package ANSYS is utilised in the present analyses to examine the effect of fire on the concrete frame under five different fire scenarios. In addition, a report on a full-scale BRE Cardington concrete building, designed to Eurocode 2 and BS 8110 and subjected to a realistic compartment fire, is also presented. The transient analyses of the present model added specific heat to the base value of dry concrete at 100°C and 200°C. A combined convective-radiative heat-transfer coefficient and transient thermal expansion were also considered in the analyses. For the analyses with transient strains included, the constitutive model employed is the full thermal strain-stress model based on empirical formulae proposed by Li and Purkiss (2005). Comparisons between the models with and without transient strains are also discussed. The results of the present study indicate that the behaviour of the complete structure differs significantly from the behaviour of the individual isolated members assumed in current design methods. Although the current tabulated design procedures are conservative when the entire building performance is considered, the beneficial and detrimental effects of thermal expansion in complete structures should be taken into account. It is therefore essential to develop new fire engineering methods from the study of complete structures rather than from the behaviour of individual isolated members.
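The combined convective-radiative heat-transfer coefficient mentioned above can be sketched using the standard linearisation of the radiative term; the convective coefficient and emissivity below are typical fire-analysis assumptions, not values taken from the thesis.

```python
# Sketch of a combined coefficient h_total = h_conv + h_rad, where the
# radiative part is linearised: eps*sigma*(Tg^4 - Ts^4) = h_rad*(Tg - Ts).
SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W/(m^2 K^4)

def combined_coefficient(t_gas_c, t_surf_c, h_conv=25.0, emissivity=0.7):
    """h_total in W/(m^2 K) for gas and surface temperatures in Celsius."""
    tg, ts = t_gas_c + 273.15, t_surf_c + 273.15
    h_rad = emissivity * SIGMA * (tg**2 + ts**2) * (tg + ts)
    return h_conv + h_rad

# e.g. 600 degC fire gases against a 200 degC concrete surface
print(combined_coefficient(600.0, 200.0))
```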

Relevance:

100.00%

Publisher:

Abstract:

Particulate solids are complex redundant systems consisting of discrete particles. The interactions between the particles are complex and have been the subject of many theoretical and experimental investigations. Investigations of particulate materials have been restricted by the lack of quantitative information on the mechanisms occurring within an assembly. Laboratory experimentation is limited because information on the internal behaviour can only be inferred from measurements on the assembly boundary, or obtained with intrusive measuring devices. In addition, comparisons between test data are uncertain because of the difficulty of reproducing exact replicas of physical systems. Nevertheless, theoretical and technological advances require more detailed material information. Numerical simulation, by contrast, affords access to information on every particle, and hence to the micro-mechanical behaviour within an assembly, and can replicate any desired system. For a computer program to simulate material behaviour accurately, it is necessary to incorporate realistic interaction laws. This research programme used the finite difference simulation program BALL, developed by Cundall (1971), which employed linear spring force-displacement laws; it was thus necessary to incorporate more realistic interaction laws. The research programme was therefore primarily concerned with the implementation of the normal force-displacement law of Hertz (1882) and the tangential force-displacement laws of Mindlin and Deresiewicz (1953). Within this thesis the contact mechanics theories employed in the program are developed and the adaptations necessary to incorporate these laws are detailed. The new contact force-displacement laws were verified by simulating a quasi-static oblique contact and a single-particle oblique impact. Applications of the program to the simulation of large assemblies of particles are given, and the problems of undertaking quasi-static shear tests are described along with the results of two successful shear tests.
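For concreteness, the Hertz (1882) normal force-displacement law the thesis implements can be sketched as below; the material parameters are illustrative assumptions, and the load-history-dependent tangential laws of Mindlin and Deresiewicz (1953) are omitted for brevity.

```python
# Hertz normal contact between two elastic spheres of the same material:
# F = (4/3) * E_eff * sqrt(R_eff) * delta**1.5 for overlap delta >= 0.
import math

def hertz_normal_force(delta, radius1, radius2, e_young, poisson):
    """Normal contact force (N) for overlap delta (m)."""
    if delta <= 0.0:
        return 0.0  # spheres not in contact
    r_eff = radius1 * radius2 / (radius1 + radius2)   # effective radius
    e_eff = e_young / (2.0 * (1.0 - poisson**2))      # effective modulus
    return (4.0 / 3.0) * e_eff * math.sqrt(r_eff) * delta**1.5

# Two 1 mm glass-like spheres overlapping by 1 micron (assumed values)
print(hertz_normal_force(1e-6, 1e-3, 1e-3, 70e9, 0.22))
```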

Relevance:

100.00%

Publisher:

Abstract:

Myopia is a refractive condition that develops because either the optical power of the eye is abnormally high or the eye is abnormally long; the optical consequence is that the focal length of the eye is too short for its physical length. The increase in axial length has been shown to match closely the dioptric error of the eye, in that a 1 mm increase in axial length usually generates 2 to 3 D of myopia. The most common form of myopia is early-onset myopia (EOM), which occurs between 6 and 14 years of age. The second most common form is late-onset myopia (LOM), which emerges in the late teens or early twenties, at a time when the eye should have ceased growing. The prevalence of LOM is increasing, and research has indicated a link with excessive and sustained nearwork. The aim of this thesis was to examine the ocular biometric correlates associated with LOM and EOM development and progression. Biometric data were recorded on 50 subjects, aged 16 to 26 years: 26 emmetropic subjects and 24 myopic subjects. Keratometry, corneal topography, ultrasonography, lens shape, central and peripheral refractive error, ocular blood flow and accommodation were measured on three occasions during an 18-month to 2-year longitudinal study. Retinal contours were derived using a specially written computer program. The thesis shows that myopia progression is related to an increase in vitreous chamber depth, a finding which supports previous work. The myopes exhibited hyperopic relative peripheral refractive error (PRE), and the emmetropes exhibited myopic relative PRE. Myopes demonstrated a prolate retinal shape, and the retina became more prolate with myopia progression. The results show that a longitudinal, rather than equatorial, increase in the posterior segment is the principal structural correlate of myopia. Retinal shape, relative PRE and the ratio of axial length to corneal curvature are indicated in this thesis as predictive factors for myopia onset and development. Data from this thesis demonstrate that myopia progression in the LOM group results from an increase in anterior segment power, owing to an increase in lens thickness, in conjunction with posterior segment elongation. Myopia progression in the EOM group is the product of a long posterior segment, which over-compensates for a weak anterior segment power; the weak anterior segment power in the EOM group is related to a combination of crystalline lens thinning and surface flattening. The results presented in this thesis confirm that posterior segment elongation is the main structural correlate of both EOM and LOM progression. The techniques and computer programs employed in the thesis are reproducible and robust, providing a valuable framework for further myopia research and the assessment of predictive factors.

Relevance:

100.00%

Publisher:

Abstract:

Many workers have studied the ocular components of eyes exhibiting differing amounts of central refractive error, but few have considered the additional information that could be derived from a study of peripheral refraction. Until now, peripheral refraction has either been measured in real eyes or modelled in schematic eyes of varying levels of sophistication. Several differences occur between measured and modelled results which, if accounted for, could yield more information about the nature of the optical and retinal surfaces and their asymmetries. Measurements of ocular components and peripheral refraction, however, have never been made in the same sample of eyes. In this study, ocular component and peripheral refraction measurements were made in a sample of young near-emmetropic, myopic and hyperopic eyes, and the data for each refractive group were averaged. A computer program was written to construct spherical-surfaced schematic eyes from these data. More sophisticated eye models were then developed using a linear algebraic ray-tracing program. This method allowed rays to be traced through toroidal aspheric surfaces that were translated or rotated with respect to each other. For simplicity, the gradient-index nature of the crystalline lens was neglected. Various alterations were made to these eye models to reproduce the measured peripheral refraction patterns. Excellent agreement was found between the modelled and measured peripheral refraction values over the central 70° of the visual field. This implies that the additional biometric features incorporated in each eye model were representative of those present in the measured eyes. As some of these features are not otherwise obtainable using in vivo techniques, it is proposed that the variation of refraction in the periphery offers a very useful optical method for studying human ocular component dimensions.
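The linear algebraic ray tracing mentioned above can be illustrated, in a much reduced form, by a paraxial 2 × 2 matrix trace through a single-surface reduced eye; the radius, index and axial length below are textbook-style assumptions, and the thesis's toroidal, decentred surfaces are beyond this sketch.

```python
# Paraxial ray tracing with 2x2 matrices acting on rays [height; n*angle].
import numpy as np

def refraction(n1, n2, radius):
    """Refraction matrix for a spherical surface of given radius (m)."""
    power = (n2 - n1) / radius
    return np.array([[1.0, 0.0], [-power, 1.0]])

def transfer(n, distance):
    """Translation matrix through a medium of index n over distance (m)."""
    return np.array([[1.0, distance / n], [0.0, 1.0]])

# Assumed reduced eye: one surface, r = 5.55 mm, n' = 4/3, length 22.22 mm
system = transfer(4.0 / 3.0, 22.22e-3) @ refraction(1.0, 4.0 / 3.0, 5.55e-3)

# A ray entering parallel to the axis at 1 mm height lands near height 0,
# i.e. the image of a distant object falls on the retina.
print(system @ np.array([1e-3, 0.0]))
```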

Relevance:

100.00%

Publisher:

Abstract:

Aim: To use previously validated image analysis techniques to determine the incremental nature of printed subjective anterior eye grading scales. Methods: A purpose-designed computer program was written to detect edges using a 3 × 3 kernel and to extract colour planes in the selected area of an image. Annunziato and Efron pictorial grades, and CCLRU and Vistakon-Synoptik photographic grades, of bulbar hyperaemia, palpebral hyperaemia, palpebral roughness, and corneal staining were analysed. Results: The increments of the grading scales were best described by a quadratic rather than a linear function. Edge detection and colour extraction image analysis for bulbar hyperaemia (r² = 0.35-0.99), palpebral hyperaemia (r² = 0.71-0.99), palpebral roughness (r² = 0.30-0.94), and corneal staining (r² = 0.57-0.99) correlated well with scale grades, although the increments varied in magnitude and direction between different scales. Repeated image analysis measures had a 95% confidence interval of between 0.02 (colour extraction) and 0.10 (edge detection) scale units (on a 0-4 scale). Conclusion: The printed grading scales were more sensitive for grading features of low severity, but grades were not comparable between grading scales. Grading of palpebral hyperaemia and of staining is complicated by the variable presentations possible. Image analysis techniques are 6-35 times more repeatable than subjective grading, with a sensitivity of 1.2-2.8% of the scale.
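A minimal sketch of the two measures named in the abstract, edge detection with a 3 × 3 kernel and colour-plane extraction, is given below; the paper does not state which kernel it used, so a Sobel kernel stands in for it here.

```python
# Toy versions of 3x3-kernel edge detection and red colour-plane extraction.
import numpy as np

SOBEL_X = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)

def edge_strength(gray):
    """Mean gradient magnitude from 3x3 horizontal/vertical Sobel kernels."""
    h, w = gray.shape
    gx = np.zeros((h - 2, w - 2))
    gy = np.zeros((h - 2, w - 2))
    for i in range(h - 2):
        for j in range(w - 2):
            patch = gray[i:i + 3, j:j + 3]
            gx[i, j] = np.sum(patch * SOBEL_X)
            gy[i, j] = np.sum(patch * SOBEL_X.T)  # transpose = vertical kernel
    return np.mean(np.hypot(gx, gy))

def relative_redness(rgb):
    """Colour extraction: red plane as a fraction of total intensity."""
    total = rgb.sum(axis=2) + 1e-9
    return float(np.mean(rgb[..., 0] / total))

# Usage on a random stand-in image (H x W x 3, values 0-255)
img = np.random.default_rng(0).integers(0, 256, (64, 64, 3)).astype(float)
print(edge_strength(img.mean(axis=2)), relative_redness(img))
```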

Relevance:

100.00%

Publisher:

Abstract:

Magnification can be provided to assist those with visual impairment to make the best use of remaining vision. Electronic transverse magnification of an object was first conceived for use in low vision in the late 1950s, but has developed slowly and is not extensively prescribed because of its relatively high cost and lack of portability. Electronic devices providing transverse magnification have been termed closed-circuit televisions (CCTVs) because of the direct cable link between the camera imaging system and monitor viewing system, but this description generally refers to surveillance devices and does not indicate the provision of features such as magnification and contrast enhancement. Therefore, the term Electronic Vision Enhancement Systems (EVES) is proposed to better distinguish and describe such devices. This paper reviews current knowledge on EVES for the visually impaired in terms of: classification; hardware and software (development of technology, magnification and field-of-view, contrast and image enhancement); user aspects (users and usage, reading speed and duration, and training); and potential future development of EVES. © 2003 The College of Optometrists.

Relevance:

100.00%

Publisher:

Abstract:

Aim: To examine the use of image analysis to quantify changes in ocular physiology. Method: A purpose-designed computer program was written to objectively quantify bulbar hyperaemia, tarsal redness, corneal staining and tarsal staining. Thresholding, colour extraction and edge detection paradigms were investigated, and the repeatability (stability) of each technique under changes in image luminance was assessed. A clinical pictorial grading scale was analysed to examine the repeatability and validity of the chosen image analysis technique. Results: Edge detection using a 3 × 3 kernel was found to be the most stable under changes in image luminance (2.6% variation over a +60% to -90% luminance range) and correlated well with the CCLRU scale images of bulbar hyperaemia (r = 0.96), corneal staining (r = 0.85) and palpebral roughness staining (r = 0.96). Extraction of the red colour plane demonstrated the best combination of correlation and sensitivity for palpebral hyperaemia (r = 0.96). Repeatability variability was <0.5%. Conclusions: Digital imaging, in conjunction with computerised image analysis, allows objective, clinically valid and repeatable quantification of ocular features. It offers the possibility of improved diagnosis and monitoring of changes in ocular physiology in clinical practice. © 2003 British Contact Lens Association. Published by Elsevier Science Ltd. All rights reserved.
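The luminance-stability test described above, measuring how much a candidate technique drifts as global image luminance is rescaled, can be sketched as follows; the thresholding measure and the luminance range are simplified stand-ins for the paper's implementation.

```python
# Rescale luminance over roughly -90% to +60% and report a measure's drift.
import numpy as np

def threshold_fraction(gray, cutoff=128):
    """Thresholding paradigm: fraction of pixels above a fixed cutoff."""
    return float(np.mean(gray > cutoff))

def luminance_drift(measure, gray, scales=(0.1, 0.4, 0.7, 1.0, 1.3, 1.6)):
    """Spread of a measure as global luminance is rescaled."""
    values = [measure(np.clip(gray * s, 0, 255)) for s in scales]
    return max(values) - min(values)

rng = np.random.default_rng(1)
img = rng.normal(128, 40, (64, 64)).clip(0, 255)
print(luminance_drift(threshold_fraction, img))  # large drift = unstable
```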

Relevance:

100.00%

Publisher:

Abstract:

The emergence of digital imaging and digital networks has made the duplication of original artwork easier. Watermarking techniques, also referred to as digital signatures, sign images by introducing changes that are imperceptible to the human eye but easily recoverable by a computer program. Error-correcting codes are a good choice for correcting the errors that can arise when the signature is extracted. In this paper, we present an error-correction scheme based on a combination of Reed-Solomon codes with another optimal linear code as the inner code. We investigate the level of noise that this scheme withstands for a fixed image capacity and various signature lengths. Finally, we compare our results with other error-correcting techniques used in watermarking. We have also created a computer program for image watermarking that uses the newly presented error-correction scheme.
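To make the inner-code layer concrete, here is a toy sketch using a Hamming(7,4) linear code, which corrects one flipped bit per block; the paper's actual scheme concatenates an inner linear code with an outer Reed-Solomon code, which is omitted here for brevity.

```python
# Single-error correction with a systematic Hamming(7,4) code.
import numpy as np

G = np.array([[1, 0, 0, 0, 1, 1, 0],   # generator matrix [I | P]
              [0, 1, 0, 0, 1, 0, 1],
              [0, 0, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])
H = np.array([[1, 1, 0, 1, 1, 0, 0],   # parity-check matrix
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])

def encode(nibble):
    """4 data bits -> 7-bit codeword."""
    return (np.array(nibble) @ G) % 2

def decode(word):
    """Correct up to one flipped bit, then return the 4 data bits."""
    word = np.array(word).copy()
    syndrome = (H @ word) % 2
    if syndrome.any():  # the syndrome equals the column of H at the error
        col = np.where((H.T == syndrome).all(axis=1))[0][0]
        word[col] ^= 1
    return word[:4]     # systematic code: data bits come first

codeword = encode([1, 0, 1, 1])
codeword[5] ^= 1         # one bit flipped by the watermarking channel
print(decode(codeword))  # -> [1 0 1 1]
```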

Relevance:

100.00%

Publisher:

Abstract:

Purpose: To assess the inter- and intra-observer variability of subjective grading of the retinal arterio-venous ratio (AVR) using a visual grading scale, and to compare the subjectively derived grades with an objective method using a semi-automated computer program. Methods: Following intraocular pressure and blood pressure measurements, all subjects underwent dilated fundus photography. 86 monochromatic retinal images centred on the optic nerve head (52 healthy volunteers) were obtained using a Zeiss FF450+ fundus camera. Arterio-venous ratio (AVR), central retinal artery equivalent (CRAE) and central retinal vein equivalent (CRVE) were calculated semi-automatically on three separate occasions by a single observer using the software VesselMap (Imedos Systems, Jena, Germany). Following the automated grading, three examiners graded the AVR visually on three separate occasions in order to assess their agreement. Results: Reproducibility of the semi-automatic parameters was excellent (ICCs: 0.97 (CRAE), 0.985 (CRVE) and 0.952 (AVR)). However, visual grading of AVR showed inter-grader differences as well as discrepancies between subjectively derived and objectively calculated AVR (all p < 0.000001). Conclusion: Grader education and experience lead to inter-grader differences; more importantly, subjective grading is not capable of picking up subtle differences across healthy individuals and does not represent the true AVR when compared with an objective assessment method. Advances in technology mean we no longer need to rely on ophthalmoscopic evaluation but can capture and store fundus images with retinal cameras, enabling vessel calibre to be measured more accurately than by visual estimation; objective measurement should therefore be integrated into optometric practice for improved accuracy and reliability of clinical assessments of retinal vessel calibre. © 2014 Spanish General Council of Optometry.

Relevance:

100.00%

Publisher:

Abstract:

The article describes the structure of an ontology model for the optimization of a sequential program. The components of an intelligent modelling system for program optimization are described, and the functions of the intelligent modelling system are defined.

Relevance:

100.00%

Publisher:

Abstract:

This paper considers the so-called "multinomial discrete choice" (MDC) model and focuses on the problem of estimating its parameters. In particular, the basic question arises of how to carry out point and interval estimation of the parameters when the model is mixed, i.e. includes both individual-specific and choice-specific explanatory variables, and no standard MDC computer program is available. The basic idea behind the solution is to use the Cox proportional-hazards method of survival analysis, which is available in any standard statistical package; provided the data are structured to satisfy certain special requirements, it yields the desired MDC solutions. The paper describes the features of the data set to be analysed.
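The equivalence the paper exploits can be sketched as follows: treat each choice occasion as a stratum, give all alternatives equal durations, and mark the chosen alternative as the single "event"; the stratified Cox partial likelihood then reduces to the conditional-logit (MDC) likelihood. The sketch below assumes the Python lifelines package and invented data, and covers only choice-specific variables; the paper's mixed case would additionally require interacting individual-specific variables with alternative dummies.

```python
# Conditional-logit estimation via a stratified Cox proportional-hazards fit.
import pandas as pd
from lifelines import CoxPHFitter

# One row per (individual, alternative); 'chosen' marks the selected option.
# cost and time are invented choice-specific covariates.
rows = [
    (1, 4.0, 10.0, 1), (1, 2.5, 30.0, 0), (1, 3.0, 20.0, 0),
    (2, 5.0, 12.0, 0), (2, 2.0, 35.0, 1), (2, 3.5, 25.0, 0),
    (3, 3.0, 15.0, 0), (3, 2.8, 22.0, 0), (3, 4.5,  8.0, 1),
    (4, 2.2, 28.0, 1), (4, 4.0,  9.0, 0), (4, 3.1, 18.0, 0),
]
data = pd.DataFrame(rows, columns=["individual", "cost", "time", "chosen"])
data["duration"] = 1.0  # identical durations: risk set = whole choice set

# Each individual is a stratum; with one event per stratum the Cox partial
# likelihood equals exp(x'b)/sum_j exp(x_j'b), the conditional logit.
cph = CoxPHFitter()
cph.fit(data, duration_col="duration", event_col="chosen",
        strata=["individual"])
cph.print_summary()  # coefficients on cost and time are the MDC estimates
```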

Relevance:

100.00%

Publisher:

Abstract:

The purpose of this ethnographic study was to describe and explain the congruency of psychological preferences identified by the Myers-Briggs Type Indicator (MBTI) and the human resource development (HRD) role of instructor/facilitator. The investigation was conducted with 23 HRD professionals who worked in the Miami, Florida area as instructors/facilitators with adult learners in job-related contexts.

The study was conducted using qualitative strategies of data collection and analysis. The research participants were selected through a purposive sampling strategy. Data collection strategies included: (a) administration and scoring of the MBTI, Form G; (b) open-ended and semi-structured interviews; (c) participant observations of the research subjects at their respective work sites and while conducting training sessions; (d) field notes; and (e) contact summary sheets to record field research encounters. Data analysis was conducted using FolioViews 3.1 for Windows, a computer program for qualitative analysis, and included: (a) coding of transcribed interviews and field notes, (b) theme analysis, (c) memoing, and (d) cross-case analysis.

The three major themes that emerged in relation to the congruency of psychological preferences and the role of instructor/facilitator were: (1) designing and preparing instruction/facilitation, (2) conducting training and managing group process, and (3) interpersonal relations and perspectives among instructors/facilitators.

The first two themes were analyzed through the combinations of the four Jungian personality functions: sensing-thinking (ST), sensing-feeling (SF), intuition-thinking (NT), and intuition-feeling (NF). The third theme was analyzed through the combinations of the attitude (energy focus) and the judgment function: extraversion-thinking (ET), extraversion-feeling (EF), introversion-thinking (IT), and introversion-feeling (IF).

A final area uncovered by this ethnographic study was the influence exerted by a training and development culture on the instructor/facilitator role. This professional culture is described and explained in terms of the shared values and expectations reported by the study respondents.

Relevance:

100.00%

Publisher:

Abstract:

This study involves Little Havana, one of the eight neighborhoods in the City of Miami. Little Havana, a flourishing Hispanic community from the 1960s through the 1980s, is now experiencing housing deterioration, economic disinvestment, and increased social needs.

Although the City developed a Community Development Plan for the neighborhood addressing its problems, needs, and objectives, the plan failed to address and take advantage of the area's prominent commercial street, Calle Ocho, as a cultural catalyst for the revitalization of the neighborhood. With an urban study and an understanding of the area's needs for transit system improvements, a program analysis, and a valuable architectural inventory, an intervention project can be developed. The project will capitalize on the area's historical and cultural assets and serve as a step towards reversing the area's decline and revitalizing the street and community, recapturing the energy present during the early years of the massive Cuban migration.

Relevance:

100.00%

Publisher:

Abstract:

Current reform initiatives recommend that geometry instruction include the study of three-dimensional geometric objects and provide students with opportunities to use spatial skills in problem-solving tasks. Geometer's Sketchpad (GSP) is a dynamic and interactive computer program that enables the user to investigate and explore geometric concepts and manipulate geometric structures. Research using GSP as an instructional tool has focused primarily on teaching and learning two-dimensional geometry. This study explored the effect of a GSP-based instructional environment on students' geometric thinking and three-dimensional spatial ability as they used GSP to learn three-dimensional geometry. For 10 weeks, 18 tenth-grade students from an urban school district used GSP to construct and analyze dynamic, two-dimensional representations of three-dimensional objects in a classroom environment that encouraged exploration, discussion, conjecture, and verification. The data were collected primarily from participant observations and clinical interviews and analyzed using qualitative methods. In addition, pretest and posttest measures of three-dimensional spatial ability and van Hiele level of geometric thinking were obtained, and the spatial ability measures were analyzed using standard t-test analysis.

The data from this study indicate that GSP is a viable tool for teaching students about three-dimensional geometric objects. A comparison of students' pretest and posttest van Hiele levels showed an improvement in geometric thinking, especially for students at the lower levels of the van Hiele theory. Evidence at the p < .05 level indicated that students' spatial ability improved significantly. Specifically, the GSP dynamic, visual environment supported students' visualization and reasoning processes as they attempted to solve challenging tasks about three-dimensional geometric objects. The GSP instructional activities also provided students with an experiential base and an intuitive understanding of three-dimensional objects from which more formal work in geometry could be pursued. This study demonstrates that by designing appropriate GSP-based instructional environments, it is possible to help students improve their spatial skills, develop more coherent and accurate intuitions about three-dimensional geometric objects, and progress through the levels of geometric thinking proposed by van Hiele.