960 results for Computer Engineering|Electrical engineering


Relevance:

50.00%

Publisher:

Abstract:

This research pursued the conceptualization, implementation, and verification of a system that enhances digital information displayed on an LCD panel for users with visual refractive errors. The target user group for this system is individuals with moderate to severe visual aberrations for whom conventional means of compensation, such as glasses or contact lenses, do not improve vision. This research is based on a priori knowledge of the user's visual aberration, as measured by a wavefront analyzer. With this information it is possible to generate images that, when displayed to this user, will counteract his/her visual aberration. The method described in this dissertation advances the development of techniques for providing such compensation by integrating spatial information in the image as a means to eliminate some of the shortcomings inherent in using display devices such as monitors or LCD panels. Additionally, physiological considerations are discussed and integrated into the method for providing said compensation. In order to provide a realistic sense of the performance of the methods described, they were tested by mathematical simulation in software, with a single-lens, high-resolution CCD camera that models an aberrated eye, and finally with human subjects having various forms of visual aberrations. Experiments were conducted on these systems, and the data collected were evaluated using statistical analysis. The experimental results revealed that the pre-compensation method produced a statistically significant improvement in vision for all of the systems. Although significant, the improvement was not as large as expected in the human subject tests. Further analysis suggests that even under the controlled conditions employed for testing with human subjects, the characterization of the eye may be changing. This would require real-time monitoring of relevant variables (e.g. pupil diameter) and continuous adjustment of the pre-compensation process to yield maximum viewing enhancement.
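The core pre-compensation idea can be sketched in a few lines, under the simplifying assumption (not stated in the abstract) that the eye's blurring is convolution with a known point spread function (PSF): the displayed image is pre-filtered with a regularized, Wiener-style inverse so that the subsequent blur approximately cancels. All parameter values below are illustrative.

```python
import numpy as np

def wiener_precompensate(image, psf, k=0.01):
    """Pre-filter an image so that later blurring by `psf` roughly
    recovers the original (regularized inverse in the frequency domain)."""
    H = np.fft.fft2(np.fft.ifftshift(psf), s=image.shape)
    G = np.conj(H) / (np.abs(H) ** 2 + k)   # Wiener-style inverse filter
    return np.real(np.fft.ifft2(np.fft.fft2(image) * G))

def blur(image, psf):
    """Model the aberrated eye as convolution with the PSF."""
    H = np.fft.fft2(np.fft.ifftshift(psf), s=image.shape)
    return np.real(np.fft.ifft2(np.fft.fft2(image) * H))

# toy aberration: a small Gaussian PSF standing in for a measured one
n = 64
y, x = np.mgrid[-n // 2:n // 2, -n // 2:n // 2]
psf = np.exp(-(x**2 + y**2) / (2 * 2.0**2))
psf /= psf.sum()

rng = np.random.default_rng(0)
img = rng.random((n, n))
pre = wiener_precompensate(img, psf)

# "viewing" each image through the aberrated eye
err_pre = np.mean((blur(pre, psf) - img) ** 2)
err_raw = np.mean((blur(img, psf) - img) ** 2)
```

Note that the pre-compensated image can exceed the display's dynamic range, which is one of the display-device shortcomings the abstract alludes to.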


This dissertation evaluated the feasibility of using commercially available immortalized cell lines to build a tissue-engineered in vitro blood-brain barrier (BBB) co-culture model for preliminary drug development studies. A mouse endothelial cell line and a rat astrocyte cell line purchased from the American Type Culture Collection (ATCC) were the building blocks of the co-culture model. An astrocyte-derived acellular extracellular matrix (aECM) was introduced into the co-culture model to provide a novel in vitro biomimetic basement membrane on which the endothelial cells could form tight junctions. Trans-endothelial electrical resistance (TEER) and solute mass transport studies were employed to quantitatively evaluate tight junction formation in the in vitro BBB models. Immunofluorescence microscopy and Western blot analysis were used to qualitatively verify the in vitro expression of occludin, one of the earliest discovered tight junction proteins. Experimental data from a total of 12 experiments conclusively showed that the novel BBB in vitro co-culture model with the astrocyte-derived aECM (CO+aECM) was promising in terms of establishing tight junction formation, as represented by TEER values, transport profiles and tight junction protein expression, when compared with traditional co-culture (CO) model setups and with endothelial cells cultured alone. The experimental data were also comparable with several existing in vitro BBB models built by various methods. An in vitro colorimetric sulforhodamine B (SRB) assay revealed that the co-cultured samples with aECM suffered less cell loss on the basal sides of the insert membranes than traditional co-culture samples. The novel tissue engineering approach using immortalized cell lines with the addition of aECM proved to be a relevant alternative to traditional BBB in vitro modeling.
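The two quantitative measures named here are computed in a standard way; a minimal sketch, with illustrative insert resistances and membrane area (not values from the study):

```python
def teer(r_total_ohm, r_blank_ohm, area_cm2):
    """Unit-area trans-endothelial electrical resistance (ohm*cm^2):
    subtract the blank-insert resistance, then scale by membrane area."""
    return (r_total_ohm - r_blank_ohm) * area_cm2

def apparent_permeability(dq_dt, area_cm2, c0):
    """Papp (cm/s) from a solute transport study: steady-state flux
    dQ/dt normalized by membrane area and donor concentration."""
    return dq_dt / (area_cm2 * c0)

# illustrative 12-well insert with a 1.12 cm^2 membrane
tight = teer(250.0, 120.0, 1.12)   # e.g. co-culture with aECM
leaky = teer(150.0, 120.0, 1.12)   # e.g. endothelial cells alone
```

A tighter barrier shows a higher unit-area TEER and a lower Papp for paracellular tracers.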


Fueled by the increasing human appetite for high computing performance, semiconductor technology has now marched into the deep sub-micron era. As transistor size keeps shrinking, more and more transistors are integrated into a single chip, tremendously increasing the power consumption and heat generation of IC chips. The rapidly growing heat dissipation greatly increases packaging/cooling costs and adversely affects the performance and reliability of a computing system. In addition, it reduces the processor's life span and may even crash the entire computing system. Therefore, dynamic thermal management (DTM) is becoming a critical problem in modern computer system design. Extensive theoretical research has been conducted to study the DTM problem. However, most of it is based on theoretically idealized assumptions or simplified models. While these models and assumptions help to greatly simplify a complex problem and make it theoretically manageable, practical computer systems and applications must deal with many practical factors and details beyond these models or assumptions. The goal of our research was to develop a test platform that can be used to validate theoretical results on DTM under well-controlled conditions, to identify the limitations of existing theoretical results, and to develop new and practical DTM techniques. This dissertation details the background and our research efforts in this endeavor. Specifically, we first developed a customized test platform based on an Intel desktop. We then tested a number of related theoretical works and examined their limitations in a practical hardware environment. With these limitations in mind, we developed a new reactive thermal management algorithm for single-core computing systems to optimize throughput under a peak temperature constraint. We further extended our research to a multicore platform and developed an effective proactive DTM technique for throughput maximization on multicore processors based on task migration and dynamic voltage and frequency scaling (DVFS). The significance of our research lies in the fact that it complements the extensive ongoing theoretical research in dealing with increasingly critical thermal problems, enabling the continued evolution of high-performance computing systems.
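The flavor of a reactive scheme like the one described can be illustrated with a lumped RC thermal model and a bang-bang DVFS policy. This is a toy sketch, not the dissertation's algorithm or platform; all constants (thermal resistance/capacitance, frequency levels, power model) are illustrative.

```python
# lumped thermal model: c_th * dT/dt = power(f) - (T - t_amb) / r_th
t_amb, r_th, c_th, dt = 35.0, 1.0, 10.0, 0.1   # degC, degC/W, J/degC, s
t_max = 80.0                                    # peak temperature constraint
freqs = [1.0, 1.5, 2.0, 2.5]                    # available DVFS levels (GHz)

def power(f):
    """Cubic power model: roughly V^2 * f with V scaling with f."""
    return 60.0 * (f / freqs[-1]) ** 3

def run(steps=5000):
    temp, level, work, trace = t_amb, len(freqs) - 1, 0.0, []
    for _ in range(steps):
        # reactive policy: throttle near the cap, speed up once cooled
        if temp > t_max - 2.0 and level > 0:
            level -= 1
        elif temp < t_max - 6.0 and level < len(freqs) - 1:
            level += 1
        f = freqs[level]
        temp += dt * (power(f) - (temp - t_amb) / r_th) / c_th
        work += f * dt          # throughput proxy: cycles delivered
        trace.append(temp)
    return work, trace

work, trace = run()
```

The guard band (throttling 2 degrees below the cap) absorbs the one-step overshoot of the discretized dynamics, so the peak constraint holds while the processor spends most of its time at the higher frequency levels.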


This dissertation introduces the design of a multimodal, adaptive, real-time assistive system as an alternative human-computer interface for individuals with severe motor disabilities. The proposed design is based on the integration of a remote eye-gaze tracking system, voice recognition software, and a virtual keyboard. The methodology relies on a user profile that customizes eye-gaze tracking using neural networks. The user profiling feature facilitates the notion of universal access to computing resources for a wide range of applications such as web browsing, email, word processing and editing. The study is significant in terms of the integration of key algorithms to yield an adaptable and multimodal interface. The contributions of this dissertation stem from the following accomplishments: (a) establishment of the data transport mechanism between the eye-gaze system and the host computer, yielding a significantly low failure rate of 0.9%; (b) accurate translation of eye data into cursor movement through successive steps that conclude with calibrated cursor coordinates computed by an improved conversion function, resulting in an average 70% reduction of the disparity between the point of gaze and the actual position of the mouse cursor compared with initial findings; (c) use of both a moving average and a trained neural network to minimize the jitter of the mouse cursor, yielding an average jitter reduction of 35%; (d) introduction of a new mathematical methodology to measure the degree of jitter of the mouse trajectory; and (e) embedding of an onscreen keyboard to facilitate text entry, and of a graphical interface used to generate user profiles for system adaptability.
The adaptive nature of the interface is achieved through the establishment of user profiles, which may contain the jitter and voice characteristics of a particular user as well as a customized list of the most commonly used words, ordered according to the user's preference in alphabetical or statistical order. This allows the system to successfully provide the capability of interacting with a computer. Every time any of the sub-systems is retrained, the accuracy of the interface response improves further.
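The jitter-reduction step in (c) and a jitter measure of the kind introduced in (d) can be sketched as follows. The mean squared step-to-step displacement below is a plausible stand-in, not the dissertation's actual metric, and the noise level is illustrative.

```python
import numpy as np

def moving_average(xy, window=5):
    """Smooth an N x 2 cursor trajectory with a running mean."""
    kernel = np.ones(window) / window
    return np.column_stack([np.convolve(xy[:, i], kernel, mode="valid")
                            for i in range(2)])

def jitter(xy):
    """Mean squared step-to-step displacement along the trajectory."""
    return float(np.mean(np.sum(np.diff(xy, axis=0) ** 2, axis=1)))

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 200)
path = np.column_stack([t * 300, t * 200])     # intended straight motion (px)
noisy = path + rng.normal(0, 4.0, path.shape)  # gaze-estimate jitter
smooth = moving_average(noisy)
```

Averaging over a short window suppresses the uncorrelated gaze noise at the cost of a small lag, which is the usual trade-off for cursor smoothing.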


With the progress of computer technology, computers are expected to be more intelligent in their interaction with humans, presenting information according to the user's psychological and physiological characteristics. However, computer users with visual problems may encounter difficulties in perceiving icons, menus, and other graphical information displayed on the screen, limiting the efficiency of their interaction with computers. In this dissertation, a personalized and dynamic image precompensation method was developed to improve the visual performance of computer users with ocular aberrations. The precompensation was applied to the graphical targets before presenting them on the screen, aiming to counteract the visual blurring caused by the ocular aberration of the user's eye. A complete and systematic modeling approach describing the retinal image formation of the computer user was presented, taking advantage of modeling tools such as Zernike polynomials, the wavefront aberration, the Point Spread Function and the Modulation Transfer Function. The ocular aberration of the computer user was first measured by a wavefront aberrometer, as a reference for the precompensation model. The dynamic precompensation was generated from the rescaled aberration, with the pupil diameter monitored in real time. The potential visual benefit of the dynamic precompensation method was explored through software simulation, using aberration data from a real human subject. An "artificial eye" experiment was conducted by simulating the human eye with a high-definition camera, providing an objective evaluation of image quality after precompensation. In addition, an empirical evaluation with 20 human participants was designed and implemented, involving image recognition tests performed under a more realistic viewing environment of computer use. The statistical analysis of the empirical experiment confirmed the effectiveness of the dynamic precompensation method, showing significant improvement in recognition accuracy. The merit and necessity of the dynamic precompensation were also substantiated by comparison with static precompensation. The visual benefit of the dynamic precompensation was further confirmed by the subjective assessments collected from the evaluation participants.
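One link in the modeling chain (wavefront to PSF) can be sketched for the simplest case of pure Zernike defocus: the pupil function carries the wavefront as a phase term, and its Fourier transform gives the PSF. Grid size, pupil fraction and the defocus amount below are illustrative, not the dissertation's values.

```python
import numpy as np

def psf_from_defocus(n=128, pupil_frac=0.5, w_defocus=0.5):
    """PSF of an eye with pure Zernike defocus (Z_2^0); wavefront in waves."""
    y, x = (np.mgrid[0:n, 0:n] - n / 2) / (n / 2 * pupil_frac)
    r2 = x**2 + y**2
    pupil = (r2 <= 1.0).astype(float)            # circular aperture
    w = w_defocus * np.sqrt(3) * (2 * r2 - 1)    # Zernike defocus polynomial
    field = pupil * np.exp(1j * 2 * np.pi * w)   # generalized pupil function
    psf = np.abs(np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(field)))) ** 2
    return psf / psf.sum()

diff_limited = psf_from_defocus(w_defocus=0.0)
aberrated = psf_from_defocus(w_defocus=0.5)
```

The aberrated PSF is broader and has a lower peak than the diffraction-limited one, which is exactly the blur the precompensation aims to counteract; rescaling the wavefront with the monitored pupil diameter is what makes the method dynamic.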


Engineering analysis on geometric models has been the main, if not the only, credible tool used by engineers and scientists to resolve physical boundary problems. New high-speed computers have facilitated the accuracy and validation of the expected results. In practice, an engineering analysis is composed of two parts: the design of the model, and the analysis of the geometry with the boundary conditions and constraints imposed on it. Numerical methods are used to resolve a large number of physical boundary problems independent of the model geometry. The time expended in the computational process is related to the imposed boundary conditions and to how well-conformed the geometry is. Any geometric model that contains gaps or open lines is considered an imperfect geometric model, and major commercial solver packages are incapable of handling such inputs. Other packages apply various methods, such as patching or zippering, to resolve these problems; but the final resolved geometry may differ from the original, and the changes may be unacceptable. The study proposed in this dissertation is based on a new technique for processing models with geometric imperfections without the need to repair or change the original geometry. An algorithm is presented that is able to analyze the imperfect geometric model with the imposed boundary conditions, using a meshfree method and a distance-field approximation to the boundaries. Experiments are proposed to analyze the convergence of the algorithm on imperfect model geometries, and the results will be compared with those for the same models with perfect geometries. Plots of the results will be presented for further analysis and conclusions on the algorithm's convergence.
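The distance-field idea can be illustrated in a few lines: an unsigned distance to a set of boundary segments is well defined even when the segments leave gaps, which is what lets a meshfree formulation tolerate imperfect geometry. This is a toy 2-D version for illustration, not the dissertation's algorithm.

```python
import numpy as np

def distance_field(points, segments):
    """Unsigned distance from each point (N x 2) to the nearest boundary
    segment; remains well defined when segments leave gaps."""
    d = np.full(len(points), np.inf)
    for a, b in segments:
        a, b = np.asarray(a, float), np.asarray(b, float)
        ab = b - a
        # closest parameter on the segment, clamped to its endpoints
        t = np.clip(((points - a) @ ab) / (ab @ ab), 0.0, 1.0)
        proj = a + t[:, None] * ab
        d = np.minimum(d, np.linalg.norm(points - proj, axis=1))
    return d

# a boundary with a gap between x = 0.4 and x = 0.6 (imperfect geometry)
gapped = [((0.0, 0.0), (0.4, 0.0)), ((0.6, 0.0), (1.0, 0.0))]
pts = np.array([[0.5, 1.0], [0.5, 0.5]])
d = distance_field(pts, gapped)
```

A meshfree method can then weight its approximation by such a field to impose boundary conditions without ever stitching the gap closed.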


Access control (AC) is a necessary defense against a large variety of security attacks on the resources of distributed enterprise applications. However, to be effective, AC in some application domains has to be fine-grained, support the use of application-specific factors in authorization decisions, and consistently and reliably enforce organization-wide authorization policies across enterprise applications. Because existing middleware technologies do not provide a complete solution, application developers resort to embedding AC functionality in application systems. This coupling of AC functionality with application logic causes significant problems, including tremendously difficult, costly and error-prone development, integration, and overall ownership of application software. The way AC for application systems is engineered needs to change. In this dissertation, we propose an architectural approach for engineering AC mechanisms that addresses the above problems. First, we develop a framework for implementing the role-based access control (RBAC) model using the AC mechanisms provided by CORBA Security. For those application domains where the granularity of CORBA controls and the expressiveness of the RBAC model suffice, our framework addresses the stated problem. In the second and main part of our approach, we propose an architecture for an authorization service, RAD, to address the problem of controlling access to distributed application resources when the granularity of, and support for, complex policies by middleware AC mechanisms are inadequate. Applying this architecture, we developed a CORBA-based application authorization service (CAAS). Using CAAS, we studied the main properties of the architecture and showed how they can be substantiated by employing CORBA and Java technologies. Our approach enables a wide-ranging solution for controlling access to the resources of distributed enterprise applications.
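The RBAC model at the heart of the framework reduces to a user-to-role-to-permission indirection; a minimal sketch (the users, roles and permission strings are invented for illustration and unrelated to the CORBA implementation):

```python
# role -> permissions; user -> assigned roles
permissions = {
    "clerk":   {"claim:read"},
    "manager": {"claim:read", "claim:approve"},
}
assignments = {
    "alice": {"manager"},
    "bob":   {"clerk"},
}

def check_access(user, permission):
    """Grant iff any role assigned to the user carries the permission."""
    return any(permission in permissions.get(role, set())
               for role in assignments.get(user, set()))
```

Because applications query only `check_access`, policy changes become edits to the role tables rather than to application logic, which is precisely the decoupling the dissertation argues for.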


III-Nitride materials have recently become promising candidates for applications superior to current technologies. However, certain issues, such as the lack of native substrates and high defect density, must be overcome for further development of III-Nitride technology. This work presents research on lattice engineering of III-Nitride materials and on the structural, optical, and electrical properties of their alloys, in order to approach the ideal material for various applications. We demonstrated non-destructive and quantitative characterization of the composition-modulated nanostructure in InAlN thin films with X-ray diffraction. We found that the development of the nanostructure depends on growth temperature, and that the composition modulation affects carrier recombination dynamics. We also showed that the controlled relaxation of a very thin AlN buffer (20-30 nm) or a graded-composition InGaN buffer can significantly reduce the defect density of a subsequent epitaxial layer. Finally, we synthesized InAlGaN thin films and a multi-quantum-well structure. Significant emission enhancement in the UVB range (280-320 nm) was observed compared to AlGaN thin films. The nature of the enhancement was investigated experimentally and numerically, suggesting carrier confinement in the In localization centers.


Brain-computer interfaces (BCI) have the potential to restore communication or control abilities in individuals with severe neuromuscular limitations, such as those with amyotrophic lateral sclerosis (ALS). The role of a BCI is to extract and decode relevant information that conveys a user's intent directly from brain electro-physiological signals and translate this information into executable commands to control external devices. However, the BCI decision-making process is error-prone due to noisy electro-physiological data, representing the classic problem of efficiently transmitting and receiving information via a noisy communication channel.

This research focuses on P300-based BCIs which rely predominantly on event-related potentials (ERP) that are elicited as a function of a user's uncertainty regarding stimulus events, in either an acoustic or a visual oddball recognition task. The P300-based BCI system enables users to communicate messages from a set of choices by selecting a target character or icon that conveys a desired intent or action. P300-based BCIs have been widely researched as a communication alternative, especially in individuals with ALS who represent a target BCI user population. For the P300-based BCI, repeated data measurements are required to enhance the low signal-to-noise ratio of the elicited ERPs embedded in electroencephalography (EEG) data, in order to improve the accuracy of the target character estimation process. As a result, BCIs have relatively slower speeds when compared to other commercial assistive communication devices, and this limits BCI adoption by their target user population. The goal of this research is to develop algorithms that take into account the physical limitations of the target BCI population to improve the efficiency of ERP-based spellers for real-world communication.

In this work, it is hypothesised that building adaptive capabilities into the BCI framework can potentially give the BCI system the flexibility to improve performance by adjusting system parameters in response to changing user inputs. The research in this work addresses three potential areas for improvement within the P300 speller framework: information optimisation, target character estimation and error correction. The visual interface and its operation control the method by which the ERPs are elicited through the presentation of stimulus events. The parameters of the stimulus presentation paradigm can be modified to modulate and enhance the elicited ERPs. A new stimulus presentation paradigm is developed in order to maximise the information content that is presented to the user by tuning stimulus paradigm parameters to positively affect performance. Internally, the BCI system determines the amount of data to collect and the method by which these data are processed to estimate the user's target character. Algorithms that exploit language information are developed to enhance the target character estimation process and to correct erroneous BCI selections. In addition, a new model-based method to predict BCI performance is developed, an approach which is independent of stimulus presentation paradigm and accounts for dynamic data collection. The studies presented in this work provide evidence that the proposed methods for incorporating adaptive strategies in the three areas have the potential to significantly improve BCI communication rates, and the proposed method for predicting BCI performance provides a reliable means to pre-assess BCI performance without extensive online testing.
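The role of repeated measurements described above can be illustrated with a toy simulation: averaging more stimulus repetitions raises the effective SNR of the target response and, with it, the selection accuracy, which is the speed/accuracy trade-off the speller must manage. Amplitudes, noise levels and the number of choices below are illustrative, not fitted to real EEG.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_accuracy(n_reps, n_trials=500, n_choices=6, amp=1.0, noise=3.0):
    """Toy P300 selection: pick the choice with the largest response
    averaged over n_reps repetitions; choice 0 is the target and
    carries an extra ERP-like amplitude."""
    correct = 0
    for _ in range(n_trials):
        scores = rng.normal(0.0, noise, (n_reps, n_choices)).mean(axis=0)
        scores[0] += amp
        correct += scores.argmax() == 0
    return correct / n_trials

acc_1 = simulate_accuracy(1)     # single flash per choice
acc_10 = simulate_accuracy(10)   # ten averaged flashes per choice
```

Averaging n repetitions shrinks the noise standard deviation by sqrt(n), which is why accuracy climbs with data collection but communication rate falls; adaptive stopping rules of the kind proposed in this work aim to collect only as much data as each selection needs.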


Stealthy attackers move patiently through computer networks, taking days, weeks or months to accomplish their objectives in order to avoid detection. As networks scale up in size and speed, monitoring for such attack attempts is increasingly challenging. This paper presents an efficient monitoring technique for stealthy attacks. It investigates the feasibility of the proposed method under a number of different test cases and examines how the design of the network affects detection. A methodical way of tracing anonymous stealthy activities to their approximate sources is also presented. Bayesian fusion along with traffic sampling is employed as a data reduction method. The proposed method is able to monitor stealthy activities using 10-20% sampling rates without degrading the quality of detection.
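The fusion step can be sketched as a per-packet Bayesian log-odds update over a sampled subset of the traffic. The sampling rate matches the 10-20% range quoted above, but the alert likelihoods and prior are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(7)

def posterior_attack(flags, p_flag_attack=0.3, p_flag_normal=0.05, prior=0.01):
    """Fuse sampled per-packet alert flags into a posterior attack
    probability via a naive-Bayes log-odds update."""
    log_odds = np.log(prior / (1 - prior))
    for f in flags:
        if f:
            log_odds += np.log(p_flag_attack / p_flag_normal)
        else:
            log_odds += np.log((1 - p_flag_attack) / (1 - p_flag_normal))
    return 1.0 / (1.0 + np.exp(-log_odds))

# 15% sampling of a 2000-packet flow
sampled = rng.random(2000) < 0.15
n = int(sampled.sum())
attack_flags = rng.random(n) < 0.3    # alerts seen on attack traffic
normal_flags = rng.random(n) < 0.05   # alerts seen on benign traffic

p_attack = posterior_attack(attack_flags)
p_normal = posterior_attack(normal_flags)
```

Even though only a fraction of packets is inspected, the evidence accumulates across the sampled subset, which is why detection quality can survive aggressive sampling.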


The inherent variability of raw materials, a lack of awareness and appreciation of the technologies for achieving quality control, and a lack of appreciation of the micro- and macro-environmental conditions to which structures will be subjected make modern-day concreting a challenge. This also leads designers and engineers to adhere closely to prescriptive standards developed for relatively less aggressive environments. Data from exposure sites and real structures prove, categorically, that prescriptive specifications are inadequate for chloride environments. In light of this shortcoming, a more pragmatic approach would be to adopt performance-based specifications, which are familiar to industry in the form of specifications for mechanical strength. A recently completed RILEM technical committee made significant advances in making such an approach feasible.
Furthering a performance-based specification requires the establishment of reliable laboratory and on-site test methods, as well as easy-to-apply service-life models. This article highlights both laboratory and on-site test methods for chloride diffusivity/electrical resistivity and the relationship between these tests for a range of concretes. Further, a performance-based approach using an on-site diffusivity test is outlined that can provide an easier-to-adopt practice for engineers and asset managers when specifying and testing concrete structures.
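An easy-to-apply service-life model of the kind alluded to is the error-function solution of Fick's second law, driven by a measured diffusivity. The cover depth, diffusivity, surface and critical chloride levels below are illustrative assumptions, not values from the article.

```python
import math

def chloride_at(depth_m, t_s, d_m2s, c_s=0.5):
    """Fick's 2nd-law solution for a constant surface level:
    C(x, t) = C_s * erfc(x / (2 * sqrt(D * t)))."""
    return c_s * math.erfc(depth_m / (2 * math.sqrt(d_m2s * t_s)))

def years_to_initiation(cover_m=0.05, d_m2s=1e-12, c_s=0.5, c_crit=0.05):
    """Bisect for the time at which chlorides at rebar depth reach
    the critical level (corrosion initiation)."""
    lo, hi = 1.0, 1e11   # seconds
    for _ in range(200):
        mid = (lo + hi) / 2
        if chloride_at(cover_m, mid, d_m2s, c_s) < c_crit:
            lo = mid
        else:
            hi = mid
    return hi / (365.25 * 24 * 3600)

years = years_to_initiation()
```

Feeding a diffusivity obtained from an on-site test into such a model is what turns a performance measurement into a service-life statement a specifier can act on.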


This keynote presentation will report on some of our research work and experience in the development and application of relevant methods, models, systems and simulation techniques in support of different types and various levels of decision making for business, management and engineering. In particular, the following topics will be covered:

- Modelling, multi-agent-based simulation and analysis of the allocation management of carbon dioxide emission permits in China (Nanfeng Liu & Shuliang Li)
- Agent-based simulation of the dynamic evolution of enterprise carbon assets (Yin Zeng & Shuliang Li)
- A framework & system for extracting and representing project knowledge contexts using topic models and dynamic knowledge maps: a big data perspective (Jin Xu, Zheng Li, Shuliang Li & Yanyan Zhang)
- Open innovation: intelligent model, social media & complex adaptive system simulation (Shuliang Li & Jim Zheng Li)
- A framework, model and software prototype for modelling and simulation of deshopping behaviour and how companies respond (Shawkat Rahman & Shuliang Li)
- Integrating multiple agents, simulation, knowledge bases and fuzzy logic for international marketing decision making (Shuliang Li & Jim Zheng Li)
- A Web-based hybrid intelligent system for combined conventional, digital, mobile, social media and mobile marketing strategy formulation (Shuliang Li & Jim Zheng Li)
- A hybrid intelligent model for Web & social media dynamics, and evolutionary and adaptive branding (Shuliang Li)
- A hybrid paradigm for modelling, simulation and analysis of brand virality in social media (Shuliang Li & Jim Zheng Li)
- Network configuration management: attack paradigms and architectures for computer network survivability (Tero Karvinen & Shuliang Li)


The development of scaffolds based on biomaterials is a promising strategy for tissue engineering and cellular regeneration. This work focuses on bone tissue engineering; the aim is to develop electrically tailored biomaterials with different crystalline and electrical features and to study their impact on cell biological behavior, so as to predict the materials' effect on the enhancement of bone tissue regeneration. It is accepted that bone exhibits piezoelectricity, a property that has been proved to be involved in the regulation of the bone growth/repair mechanism. In addition, electrical stimulation has been proved to influence bone growth and repair. Piezoelectric materials are therefore widely investigated for potential use in bone tissue engineering. The main goal is the development of novel strategies to produce and employ piezoelectric biomaterials, with detailed knowledge of the mechanisms involved in cell-material interaction. In the current work, poly(L-lactic) acid (PLLA), a synthetic semi-crystalline polymer exhibiting biodegradability, biocompatibility and piezoelectricity, is studied and proposed as a promoter of enhanced tissue regeneration. PLLA has already been approved for implantation in the human body by the Food and Drug Administration (FDA), and it is currently used in several clinical strategies. The present study consists of first preparing PLLA films with different degrees of crystallinity and characterizing them in terms of surface and structural properties, and subsequently assessing the behavior of cells in terms of viability, proliferation, morphology and mineralization for each PLLA configuration. PLLA films were prepared using the solvent-cast technique and submitted to different thermal treatments in order to obtain different degrees of crystallinity. These platforms were then electrically poled, positively and negatively, by corona discharge in order to tailor their electrical properties. The cellular assays were conducted using two different osteoblast cell lines grown directly on the PLLA films: human osteoblasts (HOb), a primary cell culture, and the human osteosarcoma MG-63 cell line. This thesis also gives a comprehensive introduction to the area of bone tissue engineering and provides a review of past and current work in the field, including that related to bone's piezoelectricity. The experimental part then addresses the effects of the degree of crystallinity and of the polarization on surface properties and on the cellular bioassays. Three different degrees of crystallinity and three different polarization conditions were prepared, resulting in 9 different configurations under investigation.


Background. Tremendous advances in biomaterials science and nanotechnology, together with thorough research on stem cells, have recently promoted an intriguing development of regenerative medicine/tissue engineering. Nanotechnology represents a wide interdisciplinary field that involves the manipulation of different materials at the nanometer level to create constructs that mimic the nanoscale architecture of native tissues. Aim. The purpose of this article is to highlight significant new knowledge regarding this matter. Emerging acquisitions. To widen the range of scaffold materials, use has been made of either materials generated by recombinant DNA technology, such as a collagen-like protein, or the incorporation of bioactive molecules, such as RGD (arginine-glycine-aspartic acid), into synthetic products. Both the bottom-up and the top-down fabrication approaches may be used, respectively, to obtain supramolecular architectures or micro-/nanostructures to incorporate within a preexisting complex scaffold construct. The computer-aided design/manufacturing (CAD/CAM) scaffold technique makes it possible to achieve patient-tailored organs. Stem cells, because of their peculiar properties - the ability to proliferate, self-renew and differentiate into specific cell lineages under appropriate conditions - represent an attractive source for intriguing tissue engineering/regenerative medicine applications. Future research activities. New developments in the tissue engineering of different organs will depend on further progress in both the science of nanoscale materials and the knowledge of stem cell biology. Moreover, in vivo tissue engineering appears to be the logical next step of the current research.


This work proposes a contribution to the Social Engineering Framework for the assessment of risk and the mitigation of different attack vectors, by means of attack-tree analysis. In addition, it presents a compilation of statistics on attacks carried out against companies in different information-security-related industries, focusing on social engineering attacks and the consequences organizations face. The statistics are accompanied by descriptions of real examples and their consequences.
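The attack-tree analysis mentioned above can be sketched as a recursive evaluation over AND/OR nodes, here computing attacker success probability under an independence assumption. The tree below is an invented toy example, not one from the study.

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    name: str
    kind: str = "leaf"                       # "leaf", "and", "or"
    p: float = 0.0                           # success probability (leaf only)
    children: list = field(default_factory=list)

def success_probability(node):
    """Attacker success probability, assuming independent sub-attacks."""
    if node.kind == "leaf":
        return node.p
    ps = [success_probability(c) for c in node.children]
    if node.kind == "and":                   # all sub-goals must succeed
        out = 1.0
        for p in ps:
            out *= p
        return out
    fail = 1.0                               # "or": any sub-goal suffices
    for p in ps:
        fail *= 1.0 - p
    return 1.0 - fail

tree = Node("gain access", "or", children=[
    Node("phishing", "and", children=[
        Node("craft convincing email", p=0.9),
        Node("victim clicks link", p=0.3),
    ]),
    Node("pretext phone call", p=0.1),
])
```

The same traversal generalizes to cost or difficulty annotations, which is how such trees support the risk assessment and mitigation planning the framework targets.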