954 results for Electromyography analysis techniques
Abstract:
Modern software systems are often large and complicated. To better understand, develop, and manage large software systems, researchers have studied software architectures, which provide the top-level structural design of a software system, for the last decade. One major research focus in software architecture is formal architecture description languages, but most existing work concentrates on descriptive capability and puts less emphasis on design methods and formal analysis techniques, which are necessary to develop correct software architecture designs.

Refinement is a general approach for adding detail to a software design, and a formal refinement method can further ensure certain design properties. This dissertation proposes refinement methods, comprising a set of formal refinement patterns and complementary verification techniques, for software architecture design using the Software Architecture Model (SAM), which was developed at Florida International University. First, a general guideline for software architecture design in SAM is proposed. Second, specification construction through property-preserving refinement patterns is discussed. The refinement patterns are categorized into connector refinement, component refinement, and high-level Petri net refinement; these three levels apply to overall system interaction, architectural components, and the underlying formal language, respectively. Third, verification after modeling is discussed as a complementary technique to specification refinement. Two formal verification tools, the Stanford Temporal Prover (STeP) and the Simple Promela Interpreter (SPIN), are adopted into SAM to verify the initial models. Fourth, formalization and refinement of security issues are studied: a method for security enforcement in SAM is proposed, the Role-Based Access Control model is formalized using predicate transition nets and Z notation, and patterns for enforcing access control and auditing are proposed. Finally, modeling and refining a life insurance system demonstrates how to apply the refinement patterns for software architecture design in SAM and how to integrate the access control model.

The results of this dissertation show that a refinement method is an effective way to develop a high-assurance system. The method extends existing work on modeling software architectures using SAM and makes SAM a more usable and valuable formal tool for software architecture design.
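The core relations of the Role-Based Access Control model mentioned above can be illustrated with a minimal sketch. This is an informal illustration only: the dissertation formalizes RBAC with predicate transition nets and Z notation, and all role, user, and permission names below are hypothetical examples, not taken from the work.

```python
# Minimal illustrative sketch of core RBAC relations: users are assigned
# roles, roles carry permissions, and an access request is granted only
# if some role assigned to the user holds the requested permission.
# All names here are hypothetical examples.

ROLE_PERMS = {
    "underwriter": {"read_policy", "write_policy"},
    "auditor": {"read_policy", "read_audit_log"},
}
USER_ROLES = {
    "alice": {"underwriter"},
    "bob": {"auditor"},
}

def check_access(user: str, permission: str) -> bool:
    """Grant iff any role assigned to the user includes the permission."""
    return any(permission in ROLE_PERMS.get(role, set())
               for role in USER_ROLES.get(user, set()))

print(check_access("alice", "write_policy"))  # True
print(check_access("bob", "write_policy"))    # False
```

An auditing pattern, in this simplified view, would wrap `check_access` and log every decision; the dissertation's patterns express this at the architectural level rather than in code.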
Abstract:
This work is the first to use patterned soft underlayers in multilevel, three-dimensional vertical magnetic data storage systems. The motivation stems from an exponentially growing information stockpile and a corresponding need for more efficient, higher-density storage devices. The world information stockpile currently exceeds 150 EB (1 exabyte = 1×10^18 bytes), most of which is in analog form. Among storage technologies (semiconductor, optical, and magnetic), magnetic hard disk drives are poised to play a major role in personal, network, and corporate storage. However, this mode suffers from the superparamagnetic limit, which caps achievable areal density due to fundamental quantum mechanical stability requirements. Many viable techniques have been considered to defer superparamagnetism into the hundreds of Gbit/in², such as patterned media, Heat-Assisted Magnetic Recording (HAMR), Self-Organized Magnetic Arrays (SOMA), antiferromagnetically coupled (AFC) structures, and perpendicular magnetic recording. Nonetheless, these techniques utilize a single magnetic layer and can thus be viewed as two-dimensional in nature. In this work a novel three-dimensional vertical magnetic recording approach is proposed that utilizes the entire thickness of a magnetic multilayer structure to store information, with potential areal density well into the Tbit/in² regime.

There are several possible implementations of 3D magnetic recording, each presenting its own requirements, merits, and challenges. The issues and considerations pertaining to the development of such systems are examined and analyzed using empirical and numerical analysis techniques. Two novel approaches are proposed and developed: (1) a patterned soft underlayer (SUL), which allows enhanced recording on thicker media; and (2) a combinatorial approach to 3D media development that facilitates concurrent investigation of the effect of various film parameters on a predefined performance metric. A case study is presented using combinatorial overcoats of tantalum and zirconium oxides for corrosion protection in magnetic media. The feasibility of 3D recording is demonstrated, with 3D media development emphasized as a key prerequisite. The patterned SUL shows significant enhancement over a conventional un-patterned SUL and demonstrates that geometry can be used as a design tool to achieve favorable field distributions wherever magnetic storage and magnetic phenomena are involved.
A framework for transforming, analyzing, and realizing software designs in Unified Modeling Language
Abstract:
The Unified Modeling Language (UML) is the most comprehensive and widely accepted object-oriented modeling language, owing to its multi-paradigm modeling capabilities, easy-to-use graphical notations, strong international organizational support, and production-quality industrial tools. However, the semantics of individual UML notations and the relationships among multiple UML models lack precise definition, which often introduces incompleteness and inconsistency into UML software designs, especially for complex systems. Furthermore, methodologies to ensure a correct implementation of a given UML design are lacking. The purpose of this investigation is to verify and validate software designs in UML and to provide dependability assurance for the realization of a UML design.

In this research, an approach is proposed to transform UML diagrams into a semantic domain: a formal component-based framework. The framework consists of components and interactions through message passing, modeled by two-layer algebraic high-level nets and transformation rules, respectively. In the transformation approach, class diagrams, state machine diagrams, and activity diagrams are transformed into component models, and transformation rules are extracted from interaction diagrams. By applying transformation rules to component models, a (sub)system model of one or more scenarios can be constructed. Techniques such as model checking and Petri net analysis can then be adopted to check whether UML designs are complete and consistent. A new component, the property parser, was developed and merged into the tool SAM Parser, which realizes (sub)system models automatically. The property parser automatically generates and weaves runtime monitoring code into system implementations for dependability assurance. The framework is flexible: it can be used not only to verify and validate UML designs but also to build models for various scenarios. As a result of this research, several kinds of previously ignored behavioral inconsistencies can be detected.
Abstract:
Accurately assessing the extent of myocardial tissue injury induced by myocardial infarction (MI) is critical to the planning and optimization of MI patient management. With this in mind, this study investigated the feasibility of using combined fluorescence and diffuse reflectance spectroscopy to characterize a myocardial infarct at different stages of its development. An animal study was conducted using twenty male Sprague-Dawley rats with MI. In vivo fluorescence spectra at 337 nm excitation and diffuse reflectance between 400 nm and 900 nm were measured from the heart using a portable fiber-optic spectroscopic system. Spectra were acquired from (1) the normal heart region, (2) the region immediately surrounding the infarct, and (3) the infarcted region, at one, two, three, and four weeks into MI development. The spectral data were divided into six subgroups according to the histopathological features associated with various degrees of myocardial tissue injury and various stages of post-infarction tissue remodeling. Several data processing and analysis techniques were employed to identify representative spectral features corresponding to the histopathological features of myocardial infarction, and the identified features were then used in discriminant analysis to evaluate their effectiveness in classifying MI-induced tissue injuries. MI was observed to induce significant alterations (p < 0.05) in the diffuse reflectance spectra, especially between 450 nm and 600 nm, from myocardial tissue within the infarcted and surrounding regions, as well as a significant elevation in fluorescence intensities at 400 and 460 nm from the same regions. The extent of these spectral alterations was related to the duration of the infarction. Using the identified spectral features, a tissue injury classification algorithm was developed that achieved a satisfactory overall classification accuracy (87.8%). These findings support the concept that optical spectroscopy is a useful tool for non-invasively determining the in vivo pathophysiological features of a myocardial infarct and its surrounding tissue, thereby providing valuable real-time feedback to surgeons during surgical interventions for MI.
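The classification step described above can be sketched in miniature. The study used discriminant analysis on selected spectral features; the nearest-centroid classifier below is a simplified stand-in, and all "spectra", band boundaries, and class labels are synthetic assumptions, not the study's data or algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 300                               # samples per synthetic spectrum
BANDS = [(50, 200), (200, 260)]       # assumed feature bands (index ranges)

def band_features(spectrum):
    # One feature per band: the mean intensity inside that band.
    return np.array([spectrum[a:b].mean() for a, b in BANDS])

def make_class(shift, n=20):
    # Synthetic spectra: unit baseline plus noise; "injury" elevates band 1.
    s = 1.0 + 0.05 * rng.standard_normal((n, N))
    s[:, 50:200] += shift
    return s

normal, injured = make_class(0.0), make_class(0.5)
centroids = {
    "normal":  band_features(normal.mean(axis=0)),
    "injured": band_features(injured.mean(axis=0)),
}

def classify(spectrum):
    # Assign the label of the nearest class centroid in feature space.
    f = band_features(spectrum)
    return min(centroids, key=lambda c: np.linalg.norm(f - centroids[c]))

print(classify(make_class(0.5, 1)[0]))  # injured
```

A full discriminant analysis would additionally weight features by within- and between-class covariance; the nearest-centroid rule is the simplest member of that family.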
Abstract:
This research develops design considerations for environmental monitoring platforms for the detection of hazardous materials using System-on-a-Chip (SoC) design. The design considerations focus on improving three key areas: (1) sampling methodology, (2) context awareness, and (3) sensor placement. These considerations for environmental monitoring platforms using wireless sensor networks (WSNs) are applied to the detection of methylmercury (MeHg) and of the environmental parameters affecting its formation (methylation) and breakdown (demethylation).

The sampling methodology investigates a proof of concept for monitoring MeHg using three primary components: (1) chemical derivatization, (2) preconcentration using the purge-and-trap (P&T) method, and (3) sensing with quartz crystal microbalance (QCM) sensors. This study focuses on the measurement of inorganic mercury (e.g., Hg2+) and applies the lessons learned to organic Hg (e.g., MeHg) detection.

Context awareness of a WSN and its sampling strategies is enhanced using spatial analysis techniques, namely geostatistical analysis (classical variography and ordinary point kriging), to predict the phenomenon of interest at unmonitored locations (i.e., locations without sensors). This aids in making more informed decisions about control of the WSN (e.g., communications strategy, power management, resource allocation, and sampling rate and strategy), improving the precision of control by adding potentially significant information about unmonitored locations.

Two types of sensors are investigated for near-optimal placement in a WSN: (1) environmental sensors (e.g., humidity, moisture, and temperature) and (2) visual sensors (e.g., cameras). Near-optimal placement of environmental sensors is found using a strategy that minimizes the variance of the spatial analysis over randomly chosen candidate sensor locations; the spatial analysis uses geostatistics, and the optimization uses Monte Carlo analysis. Visual sensor placement for omnidirectional cameras in a WSN uses an optimal placement metric (OPM), calculated for each grid point from line-of-sight (LOS) in a defined number of directions while taking known obstacles into consideration. Optimal areas for camera placement are those generating the largest OPMs. Statistical behavior is examined with Monte Carlo analysis over varying numbers of obstacles and cameras in a defined space.
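The OPM computation described above can be sketched as follows. The dissertation defines the metric from line-of-sight in a fixed number of directions with known obstacles; the ray-stepping scheme, normalization, and grid below are assumptions for illustration, not the dissertation's exact formulation.

```python
import math

def opm(grid, x, y, n_dirs=8, max_range=50):
    """Normalized line-of-sight score for a candidate camera cell.

    Casts a ray in each of n_dirs directions and counts visible cells
    until an obstacle (grid value 1) or the grid edge is reached.
    """
    if grid[y][x]:                    # cannot place a camera inside an obstacle
        return 0.0
    h, w = len(grid), len(grid[0])
    visible = 0
    for k in range(n_dirs):
        ang = 2 * math.pi * k / n_dirs
        dx, dy = math.cos(ang), math.sin(ang)
        for step in range(1, max_range + 1):
            cx, cy = round(x + dx * step), round(y + dy * step)
            if not (0 <= cx < w and 0 <= cy < h) or grid[cy][cx]:
                break
            visible += 1
    return visible / (n_dirs * max_range)

# Toy 10x10 grid with a vertical wall of obstacles at x = 5.
grid = [[0] * 10 for _ in range(10)]
for row in grid:
    row[5] = 1

# The best placement is the grid point with the largest OPM.
best_score, bx, by = max((opm(grid, x, y), x, y)
                         for y in range(10) for x in range(10))
```

A Monte Carlo study, as in the dissertation, would repeat this over randomly generated obstacle configurations and camera counts.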
Resumo:
Recreational abuse of the drugs cocaine, methamphetamine, and morphine continues to be prevalent in the United States of America and around the world. While numerous methods of detection exist for each drug, they are generally limited by the lifetime of the parent drug and its metabolites in the body. However, the covalent modification of endogenous proteins by these drugs of abuse may act as biomarkers of exposure and allow for extension of detection windows for these drugs beyond the lifetime of parent molecules or metabolites in the free fraction. Additionally, existence of covalently bound molecules arising from drug ingestion can offer insight into downstream toxicities associated with each of these drugs. This research investigated the metabolism of cocaine, methamphetamine, and morphine in common in vitro assay systems, specifically focusing on the generation of reactive intermediates and metabolites that have the potential to form covalent protein adducts. Results demonstrated the formation of covalent adduction products between biological cysteine thiols and reactive moieties on cocaine and morphine metabolites. Rigorous mass spectrometric analysis in conjunction with in vitro metabolic activation, pharmacogenetic reaction phenotyping, and computational modeling were utilized to characterize structures and mechanisms of formation for each resultant thiol adduction product. For cocaine, data collected demonstrated the formation of adduction products from a reactive arene epoxide intermediate, designating a novel metabolic pathway for cocaine. In the case of morphine, data expanded on known adduct-forming pathways using sensitive and selective analysis techniques, following the known reactive metabolite, morphinone, and a proposed novel metabolite, morphine quinone methide. 
Data collected in this study describe novel metabolic events for multiple important drugs of abuse, culminating in detection methods and mechanistic descriptors useful to both medical and forensic investigators when examining the toxicology associated with cocaine, methamphetamine, and morphine.
Abstract:
Thirty-seven deep-sea sediment cores from the Arabian Sea were studied geochemically (49 major and trace elements) for four time slices during the Holocene and the last glacial, and in one high-sedimentation-rate core (century-scale resolution), to detect tracers of past variations in the intensity of the atmospheric monsoon circulation and its hydrographic expression at the ocean surface. This geochemical multi-tracer approach, coupled with additional information on the grain-size composition of the clastic fraction and the bulk carbonate and biogenic opal contents, makes it possible to characterize the sedimentological regime in detail. Sediments characterized by specific elemental enrichments originated from the following sources: river suspensions from the Tapti and Narbada, draining the Indian Deccan Traps (Ti, Sr); Indus sediments and dust from Rajasthan and Pakistan (Rb, Cs); dust from Iran and the Persian Gulf (Al, Cr); dust from central Arabia (Mg); and dust from East Africa and the Red Sea (Zr/Hf, Ti/Al). Corg, Cd, Zn, Ba, Pb, U, and the HREE are associated with the intensity of upwelling in the western Arabian Sea, but only patterns consistently reproduced by all of these elements can be directly linked with the intensity of the southwest monsoon; relying on a single element can be misleading, as each element is affected by processes other than upwelling intensity and the nutrient content of surface water alone. The geochemical multi-tracer approach indicates that the intensity of the southwest monsoon was low during the LGM, declined to a minimum from 15,000 to 13,000 ¹⁴C yr BP, intensified slightly at the end of this interval, was almost stable during the Bölling, Alleröd, and Younger Dryas, and then intensified in two abrupt steps at the end of the Younger Dryas (9,900 ¹⁴C yr BP) and especially in a second event during the early Holocene (8,800 ¹⁴C yr BP).

Dust discharge by northwesterly winds from Arabia exhibited a similar evolution but followed an opposite course: it was high during the LGM, with two primary sources, the central Arabian desert and the dry Persian Gulf region, and reached a pronounced maximum at 15,000-13,000 ¹⁴C yr BP. At the end of this interval, the dust plumes from the Persian Gulf area ceased dramatically, whereas dust discharge from central Arabia decreased only slightly. Dust discharge from East Africa and the Red Sea increased synchronously with the two major events of southwest monsoon intensification recorded in the nutrient content of surface waters. In addition to tracers of past dust flux and surface-water nutrient content, the geochemical multi-tracer approach provides information on the history of deep-sea ventilation (Mo, S), which was much lower during the last glacial maximum than during the Holocene. The multi-tracer approach, i.e., a few sedimentological parameters plus a set of geochemical tracers widely available from various multi-element analysis techniques, is highly applicable to studying the complex sedimentation patterns of an ocean basin and, in the case of the Arabian Sea, can even reveal the seasonal structure of climate change.
Abstract:
Software product line engineering promotes large-scale software reuse by developing a system family that shares a set of core features and by enabling the selection and customization of the variabilities that distinguish each software product from the others. To address time-to-market pressure, the software industry has been using the clone-and-own technique to create and manage new software products and product lines. Despite its advantages, clone-and-own brings several difficulties for the evolution and reconciliation of software product lines, especially because of the code conflicts generated by the simultaneous evolution of the original software product line, called the Source, and its cloned products, called the Target. This thesis proposes an approach to evolve and reconcile cloned products based on mining software repositories and code conflict analysis techniques. The approach supports the identification of different kinds of code conflicts (lexical, structural, and semantic) that can occur when integrating development tasks (bug fixes, enhancements, and new use cases) from the original evolved software product line into the cloned product line. We also conducted an empirical characterization study of the code conflicts produced during the evolution and merging of two large-scale web information system product lines. The results demonstrate the approach's potential to automatically or semi-automatically resolve many existing code conflicts, thus helping to reduce the complexity and cost of reconciling cloned software product lines.
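The simplest of the three conflict kinds, the lexical conflict, can be sketched in a few lines: if the Source and the Target both changed the same line relative to their common ancestor, the edits overlap. This toy stand-in ignores insertions, deletions, and the structural and semantic analyses the thesis actually performs.

```python
# Toy lexical-conflict detector for equal-length file versions.
# A lexical conflict is flagged where BOTH the source product line and
# its clone changed the same line relative to the common ancestor.

def changed_lines(base, variant):
    # Indices where the variant differs from the common ancestor.
    return {i for i, (b, v) in enumerate(zip(base, variant)) if b != v}

def lexical_conflicts(base, source, target):
    return changed_lines(base, source) & changed_lines(base, target)

base   = ["a", "b", "c", "d"]
source = ["a", "B", "c", "D"]   # Source edited lines 1 and 3
target = ["a", "b2", "c", "d"]  # Target (clone) edited line 1
print(sorted(lexical_conflicts(base, source, target)))  # [1]
```

Structural and semantic conflicts require parsing and behavioral analysis rather than line comparison, which is where repository mining earns its keep.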
Abstract:
The meaning of work is a construct that has been studied more systematically since the 1980s, through various approaches and across different occupational categories. This dissertation aims to describe and discuss the meanings of work for construction workers. It is an empirical study grounded in the Attribute Model of the Meaning of Work and its associated measurement instrument, the Meaning of Work Inventory (IST). The research involved 402 workers in the construction industry in two capitals of the Brazilian Northeast, with a mean age of 35.8 years (SD = 11.4). In addition to the IST, a working conditions questionnaire and sociodemographic data were used in data collection. Data were organized and analyzed using SPSS, employing Smallest Space Analysis (SSA), descriptive statistics, correlation, and analysis of variance. Evidence of validity was found for the IST, which was structured into five types of value attributes (what work should be) and seven types of descriptive attributes (what work is). The results showed that work has high centrality for the participants, figuring, after family, as the most important aspect of the workers' lives. Aspects of personal and economic growth were most emphasized in defining what work should be, while responsibility and effort were the characteristics that best described the reality of work.
Abstract:
Strategy research has been widespread for many years and, more recently, the process of strategy formation from an individual perspective has also gained attention in academia. In line with this trend, the goal of this study is to discuss the process of strategy formation from an individual perspective, based on the three dimensions of the strategic process (change, thinking, and formation) proposed by De Wit and Meyer (2004). To this end, this exploratory-descriptive study used factor analysis, non-parametric correlation, and linear regression to analyze data collected from decision makers at 93 construction-supplies retailers in Natal and its metropolitan area. The results show that most of the formation factors of the investigated dimensions were identified, confirming the existence of paradoxes in the strategic process, and that there is a relationship between logical thinking and deliberate formation on the one hand and the hierarchical level of the decision makers on the other.
Abstract:
Support foundations are a type of private-law legal entity created to support research, education, and extension projects and the institutional, scientific, and technological development of Brazil. Seen as links in the relationship among company, university, and government, support foundations emerged on the Brazilian scene from the principle of establishing an economic development platform based on three pillars: science, technology, and innovation (ST&I). In applied terms, they operate as tools of debureaucratization, making management between public entities more agile, especially academic management, in line with the Triple Helix approach. Accordingly, the present study aims to understand how Triple Helix relations intervene in the fund-raising process of Brazilian support foundations. To understand these relations, the study draws on the university-company-government interaction models proposed by Sábato and Botana (1968), the Triple Helix approach of Etzkowitz and Leydesdorff (2000), and the national innovation systems perspective discussed by Freeman (1987, 1995), Nelson (1990, 1993), and Lundvall (1992). The research population consists of the 26 state research-support foundations associated with the National Council of State Research Support Foundations (CONFAP) and the 102 foundations supporting higher education institutions associated with the National Council of Support Foundations for Institutions of Higher Education and Scientific and Technological Research (CONFIES), totaling 128 entities. As a research strategy, this is an applied study with a quantitative approach. Primary data were collected using an e-mail survey; seventy-five responses were obtained, corresponding to 58.59% of the research universe.

The bootstrap method was considered in order to validate the use of the sample in the analysis of results. For data analysis, descriptive statistics and multivariate techniques were used: cluster analysis, canonical correlation, and binary logistic regression. The canonical roots obtained indicate that the dependency between the relation variables (with the Triple Helix actors) and the financial resources invested in innovation projects is low, supporting the study's null hypothesis that Triple Helix relations have not interfered, positively or negatively, in raising funds for investment in innovation projects. On the other hand, the cluster analysis indicates that the entities with the largest numbers and financial volumes of projects are mostly large foundations (over 100 employees) that support up to five higher education institutions, publish management reports, and rely more heavily on public-sector financing in their capital structure. Finally, the logistic model obtained in this study showed high predictive capacity (80.0% correct classification), allowing the academic community to replicate it in similar analysis settings.
Abstract:
Composite NiO-Ce0.9Gd0.1O1.95 (NiO-GDC), currently one of the materials most used in the manufacture of anodes for Solid Oxide Fuel Cells (SOFC), was obtained by a chemical route that consists of mixing precursor solutions of the NiO and GDC phases previously obtained by the Pechini method. The as-obtained nanopowders were characterized by thermal analysis techniques (thermogravimetry and differential scanning calorimetry), and the calcined materials were evaluated by X-ray diffraction (XRD). Samples sintered between 1400 and 1500 °C for 4 h were characterized by the Archimedes method. The effects of composition on the microstructure and electrical properties (conductivity and activation energy) of the composites sintered at 1500 °C were investigated by electron microscopy and impedance spectroscopy (between 300 and 650 °C in air). Refinement of the XRD data indicated that the powders are ultrafine and that the crystallite size of the GDC phase decreases with increasing NiO content. Similarly, the crystallite size of the NiO phase tends to decrease with increasing GDC concentration, especially above 50 wt% GDC. The Archimedes analysis shows a variation in relative density with NiO content; densities above 95% were obtained in samples containing at least 50 wt% NiO and sintered between 1450 and 1500 °C. The microscopy and impedance spectroscopy results indicate that from 30-40 wt% NiO onward there is an increase in the number of NiO-NiO contacts, activating the electronic conduction mechanism that governs conduction at low temperatures (300-500 °C). On the other hand, with increasing measurement temperature the mobility of oxygen vacancies becomes larger than that of the electronic holes of NiO; as a result, the high-temperature conductivity (500-650 °C) of composites containing up to 30-40 wt% NiO is lower than that of GDC. Variations in activation energy confirm the change of conduction mechanism with increasing NiO content. The composite containing 50 wt% of each phase shows a conductivity of 19 mS/cm at 650 °C (slightly higher than the 13 mS/cm found for GDC) and an activation energy of 0.49 eV.
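The activation energy quoted above comes from an Arrhenius analysis of the conductivity data; a minimal sketch of that extraction follows, assuming the common form sigma*T = A*exp(-Ea/(kB*T)). The 500 °C data point below is synthetic, generated from the model to be consistent with the reported 19 mS/cm at 650 °C and Ea = 0.49 eV; it is not a measured value.

```python
import math

k_B = 8.617e-5            # Boltzmann constant, eV/K

def activation_energy(T1, s1, T2, s2):
    """Ea from two conductivity points, assuming sigma*T = A*exp(-Ea/(k_B*T))."""
    return k_B * math.log((s1 * T1) / (s2 * T2)) / (1 / T2 - 1 / T1)

# Reported point: 19 mS/cm at 650 C; a second (synthetic) point at 500 C
# is generated from the Arrhenius model with the reported Ea = 0.49 eV.
T1, s1 = 650 + 273.15, 19.0
Ea_true = 0.49
T2 = 500 + 273.15
s2 = s1 * (T1 / T2) * math.exp(-Ea_true / k_B * (1 / T2 - 1 / T1))

print(round(activation_energy(T1, s1, T2, s2), 2))  # 0.49
```

In practice Ea is obtained from a linear fit of ln(sigma*T) versus 1/T over many temperatures, not from just two points.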
Abstract:
This work presents a new ceramic material obtained through the incorporation of a solid waste from the steel industry, known as dedusting powder (PAE), into ceramic formulations based on clay, potassium and sodium feldspars, kaolin, and talc. Formulations were prepared with residue contents of 0% (base mass, MB), 2%, 4%, and 8%, and fired at 1000 °C, 1050 °C, 1100 °C, and 1150 °C for periods of 15 min and 120 min. The physicochemical and mechanical properties of these ceramic formulations were determined as a function of firing temperature, residence time in the kiln, and waste percentage. The sintered materials were evaluated by chemical analysis techniques (X-ray fluorescence, FRX), particle-size distribution, specific surface area, apparent density, structural analysis by X-ray diffraction (DRX), and surface characterization by scanning electron microscopy (SEM). The magnetic response and the magnetic ferrite pattern of the samples were analyzed under the assay conditions; the saturation magnetic susceptibility was observed to depend on the sintering temperature of the material and to be associated with its crystal structure. From the analysis results, it was concluded that the ceramic material with the best physical and mechanical properties is obtained when 8% of PAE residue is added to the standard formulation, with a firing time of 15 minutes at 1150 °C.
Abstract:
This research investigates the presence of the image on the cover of Folha de S. Paulo's Sunday cultural supplement, Ilustríssima, through a case study. The focus was the analysis of the social and aesthetic conditions of production of this image of artistic origin, taking into account the mixture of art and journalism that the supplement embodies and the concepts of hybridization and convergence. Discourse analysis techniques assisted in examining the articulation between aesthetic issues (textual conditions) and extratextual issues (social conditions), where meanings are renewed through the tensions and contradictions between text and context. The methodology is based on Visual Studies, a field whose main influence is the thought of Edgar Morin, enabling a complex view of the production of the image featured in Ilustríssima.