911 results for architectural design -- data processing
Abstract:
Microarray technology provides a high-throughput technique to study gene expression. Microarrays can help us diagnose different types of cancers, understand biological processes, assess host responses to drugs and pathogens, find markers for specific diseases, and much more. Microarray experiments generate large amounts of data. Thus, effective data processing and analysis are critical for making reliable inferences from the data.

The first part of the dissertation addresses the problem of finding an optimal set of genes (biomarkers) to classify a set of samples as diseased or normal. Three statistical gene selection methods (GS, GS-NR, and GS-PCA) were developed to identify a set of genes that best differentiate between samples. A comparative study of different classification tools was performed, and the best combinations of gene selection methods and classifiers for multi-class cancer classification were identified. For most of the benchmark cancer data sets, the gene selection method proposed in this dissertation, GS, outperformed other gene selection methods. The classifiers based on Random Forests, neural network ensembles, and K-nearest neighbor (KNN) showed consistently good performance. A striking commonality among these classifiers is that they all use a committee-based approach, suggesting that ensemble classification methods are superior.

The same biological problem may be studied at different research labs and/or performed using different lab protocols or samples. In such situations, it is important to combine results from these efforts. The second part of the dissertation addresses the problem of pooling the results from different independent experiments to obtain improved results. Four statistical pooling techniques (Fisher's inverse chi-square method, the Logit method, Stouffer's Z-transform method, and the Liptak-Stouffer weighted Z-method) were investigated in this dissertation. These pooling techniques were applied to the problem of identifying cell cycle-regulated genes in two different yeast species. As a result, improved sets of cell cycle-regulated genes were identified. The last part of the dissertation explores the effectiveness of wavelet data transforms for the task of clustering. Discrete wavelet transforms, with an appropriate choice of wavelet bases, were shown to be effective in producing clusters that were biologically more meaningful.
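The p-value pooling idea can be illustrated with a short sketch of Stouffer's Z-transform method and its Liptak-Stouffer weighted variant, two of the four techniques named above. This is a generic illustration under the assumption that one-sided p-values from independent experiments are available; the function name and example values are hypothetical, not taken from the dissertation.

```python
# Minimal sketch of p-value pooling in the spirit of Stouffer's Z-transform
# method and the Liptak-Stouffer weighted variant named above.
# Function name and example values are illustrative, not from the dissertation.
import numpy as np
from scipy.stats import norm

def stouffer_pool(p_values, weights=None):
    """Pool one-sided p-values from independent experiments.

    Each p-value is converted to a Z score; the (optionally weighted)
    sum of Z scores is renormalized and mapped back to a pooled p-value.
    """
    p = np.asarray(p_values, dtype=float)
    z = norm.isf(p)                                  # Z_i = Phi^{-1}(1 - p_i)
    w = np.ones_like(z) if weights is None else np.asarray(weights, dtype=float)
    z_pooled = np.sum(w * z) / np.sqrt(np.sum(w ** 2))
    return norm.sf(z_pooled)                         # pooled p-value

# Example: the same gene tested in three independent cell-cycle experiments.
print(stouffer_pool([0.04, 0.10, 0.02]))                 # unweighted Stouffer
print(stouffer_pool([0.04, 0.10, 0.02], [20, 10, 30]))   # weighted, e.g. by sample size
```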
Abstract:
Ensuring the correctness of software has been the major motivation in software research, constituting a Grand Challenge. Due to its impact on the final implementation, one critical aspect of software is its architectural design. By guaranteeing a correct architectural design, major and costly flaws can be caught early in the development cycle. Software architecture design has received a lot of attention in recent years, with several methods, techniques and tools developed. However, there is still more to be done, such as providing adequate formal analysis of software architectures. In this regard, a framework to ensure system dependability from design to implementation has been developed at FIU (Florida International University). This framework is based on SAM (Software Architecture Model), an ADL (Architecture Description Language) that allows hierarchical compositions of components and connectors, defines an architectural modeling language for the behavior of components and connectors, and provides a specification language for the behavioral properties. The behavioral model of a SAM model is expressed in the form of Petri nets and the properties in first-order linear temporal logic.

This dissertation presents a formal verification and testing approach to guarantee the correctness of software architectures. The software architectures studied are expressed in SAM. For the formal verification approach, the technique applied was model checking and the model checker of choice was Spin. As part of the approach, a SAM model is formally translated to a model in the input language of Spin and verified for its correctness with respect to temporal properties. In terms of testing, a testing approach for SAM architectures was defined that includes the evaluation of test cases, based on Petri net testing theory, to be used in the testing process at the design level. Additionally, the information at the design level is used to derive test cases for the implementation level. Finally, a modeling and analysis tool (SAM tool) was implemented to help support the design and analysis of SAM models. The results show the applicability of the approach to testing and verification of SAM models with the aid of the SAM tool.
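Since SAM behavioral models are Petri nets checked against temporal properties, the sketch below illustrates the underlying idea on a toy place/transition net: enumerate the reachable markings and check a simple safety property. This is a hypothetical, self-contained Python example, not the SAM tool or its translation to Spin's input language.

```python
# Minimal sketch: exhaustive state-space exploration of a place/transition net,
# illustrating the kind of Petri net behavioral model and safety checking
# described above. Illustration only, not the SAM-to-Spin translation.
from collections import deque

# A tiny net: two components competing for one connector "lock".
# transitions: name -> (consumed places, produced places)
transitions = {
    "t1_acquire": ({"idle1": 1, "lock": 1}, {"busy1": 1}),
    "t1_release": ({"busy1": 1}, {"idle1": 1, "lock": 1}),
    "t2_acquire": ({"idle2": 1, "lock": 1}, {"busy2": 1}),
    "t2_release": ({"busy2": 1}, {"idle2": 1, "lock": 1}),
}
initial = {"idle1": 1, "idle2": 1, "lock": 1}

def enabled(marking, pre):
    return all(marking.get(p, 0) >= n for p, n in pre.items())

def fire(marking, pre, post):
    m = dict(marking)
    for p, n in pre.items():
        m[p] -= n
    for p, n in post.items():
        m[p] = m.get(p, 0) + n
    return m

def check_mutual_exclusion():
    """Breadth-first search over reachable markings; flag any marking
    where both components hold the connector at once."""
    seen, queue = set(), deque([initial])
    while queue:
        m = queue.popleft()
        key = tuple(sorted((p, n) for p, n in m.items() if n))
        if key in seen:
            continue
        seen.add(key)
        if m.get("busy1", 0) and m.get("busy2", 0):
            return False, m                    # safety property violated
        for pre, post in transitions.values():
            if enabled(m, pre):
                queue.append(fire(m, pre, post))
    return True, len(seen)                     # property holds on all reachable markings

print(check_mutual_exclusion())
```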
Abstract:
Underwater sound is very important in the field of oceanography, where it is used for remote sensing in much the same way that radar is used in atmospheric studies. One way to mathematically model sound propagation in the ocean is by using the parabolic-equation method, a technique that allows range-dependent environmental parameters. More importantly, this method can model sound transmission where the source emits either a pure tone or a short pulse of sound. Based on the parabolic approximation method and using the split-step Fourier algorithm, a computer model for underwater sound propagation was designed and implemented. This computer model differs from previous models in its use of the interactive mode, structured programming, modular design, and state-of-the-art graphics displays. In addition, the model maximizes the efficiency of computer time through synchronization of loosely coupled dual processors and the design of a restart capability. Since the model is designed for adaptability and for users with limited computer skills, it is anticipated that it will have many applications in the scientific community.
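The split-step Fourier marching scheme at the heart of such models can be sketched briefly. The fragment below advances the acoustic field by one range step, alternating a diffraction step in the vertical-wavenumber domain with a refraction step in the depth domain; the frequency, sound-speed profile, and starting field are illustrative placeholders, not parameters of the original model.

```python
# Minimal sketch of one range step of the split-step Fourier solution of the
# standard parabolic equation, as used in ocean acoustic propagation models.
# Environmental values here are illustrative, not those of the original model.
import numpy as np

def split_step(psi, dz, dr, k0, n):
    """Advance the field envelope psi(z) by one range step dr.

    psi : complex field on a uniform depth grid (spacing dz)
    k0  : reference wavenumber 2*pi*f/c0
    n   : index of refraction c0/c(z) on the same depth grid
    """
    kz = 2.0 * np.pi * np.fft.fftfreq(psi.size, d=dz)    # vertical wavenumbers
    # 1) diffraction half of the operator, applied in the wavenumber domain
    psi_hat = np.fft.fft(psi) * np.exp(-1j * kz**2 * dr / (2.0 * k0))
    # 2) refraction half, applied in the depth domain
    return np.fft.ifft(psi_hat) * np.exp(1j * k0 * (n**2 - 1.0) * dr / 2.0)

# Illustration: 25 Hz source, 1500 m/s reference sound speed, homogeneous water.
f, c0 = 25.0, 1500.0
k0 = 2.0 * np.pi * f / c0
z = np.linspace(0.0, 4000.0, 2048)            # depth grid (m)
dz = z[1] - z[0]
psi = np.exp(-((z - 1000.0) / 30.0) ** 2)     # Gaussian starter at 1000 m depth
n = np.ones_like(z)                           # flat sound-speed profile
for _ in range(200):                          # march out 200 steps of 50 m = 10 km
    psi = split_step(psi, dz, dr=50.0, k0=k0, n=n)
```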
Abstract:
The great demand for designs for Federal Institutions of Higher Education (IFES), triggered by a favourable political moment, boosts the public works market and brings with it the burden of seeking the lowest cost while meeting federal Law 8.666/93 (Bidding Law). In this context, this research analyzes compliance with Fire Safety normative requirements in IFES architectural designs, taking this as a measure of design quality. The study used IFES designs, specifically from UFERSA (Universidade Federal Rural do Semiárido) and UFRN (Universidade Federal do Rio Grande do Norte), selected for their ratio of use to public served and for the replication of these buildings' construction. The research identified the Fire Safety legislation applicable to the designs in question and determined the required conditions that fall within the architect's autonomy or that affect the architectural design. Once the requirements were tabulated, data and measurements gathered from the blueprints were compared against them to verify compliance. The results of this evaluation reveal that the minimum requirements were not fulfilled and that the IFES architectural designs studied here will certainly face restrictions in their regularization process with the Fire Department. It is concluded that IFES designs must be improved to meet minimum fire safety regulations and to raise their quality. Moreover, the results suggest that the level of Fire Safety knowledge architects receive in undergraduate education is insufficient for the proper elaboration of architectural designs in this area.
Abstract:
The integration between architectural design and structural systems constitutes, in academic education, one of the main challenges of architectural design teaching. Recent studies point to the relevance of using computational tools in academic settings as an important strategy for such integration. Although in recent years teaching experiences using BIM (Building Information Modeling) have begun to be incorporated by architecture schools, there is still a need for further didactic and pedagogical practices that promote the integrated teaching of architectural design and structure. This paper analyzes experiences developed at UFRN and UFPB, seeking to identify the tools, processes and products used, and pointing out limitations and potentials in subjects taught at these institutions. The research begins with a literature review on BIM teaching and on aspects related to the integration of architectural design and structure. Data collection techniques in the studio included direct observation, questionnaires, and interviews with students and teachers, with a mixed method of qualitative and quantitative analysis. At UFRN, the Integrated Workshop, a compulsory subject in the curriculum, favors the integration studied here, as it brings teachers from different disciplines into the same project studio. Regarding the use of BIM, the courses form initial users, BIM modelers, able to extract quantities automatically and speed up production, gaining quality in the products; however, learning the tool and designing in parallel causes some difficulties. At UFPB, the lack of required courses on BIM generates a lack of knowledge of, and confidence in, the tool and its processes among most students. Thus we see the need for greater efforts by the schools to adopt BIM skills and training. There is a greater need to work on the BIM concept, in order to promote the BIM process and a consequently better use of the tools, avoiding reducing the technology to merely a tool. The inclusion of specific subjects covering more advanced BIM skills is suggested, through partnerships with engineering programs and the promotion of transdisciplinary integration, favoring the exchange of different cultures within the academic environment.
Abstract:
Burn injuries in the United States account for over one million hospital admissions per year, with treatment estimated at four billion dollars. Of severe burn patients, 30-90% will develop hypertrophic scars (HSc). Current burn therapies rely upon the use of bioengineered skin equivalents (BSEs), which assist in wound healing but do not prevent HSc. HSc contraction occurs over 6-18 months and results in the formation of a fixed, inelastic skin deformity, with 60% of cases occurring across a joint. HSc contraction is characterized by an abnormally high presence of contractile myofibroblasts, which normally apoptose at the completion of the proliferative phase of wound healing. Additionally, clinical observation suggests that the likelihood of HSc is increased in injuries with a prolonged immune response. Given the pathogenesis of HSc, we hypothesize that BSEs should be designed with two key anti-scarring characteristics: (1) 3D architecture and surface chemistry that mitigate the inflammatory microenvironment and decrease myofibroblast transition; and (2) materials that persist in the wound bed throughout the remodeling phase of repair. We employed electrospinning and 3D printing to generate scaffolds with well-controlled degradation rate, surface coatings, and 3D architecture to explore our hypothesis through four aims.
In the first aim, we evaluate the impact of an elastomeric, randomly oriented, biostable polyurethane (PU) scaffold on HSc-related outcomes. In unwounded skin, native collagen is arranged randomly, elastin fibers are abundant, and myofibroblasts are absent. Conversely, in scar contractures, collagen is arranged in linear arrays and elastin fibers are few, while myofibroblast density is high. Randomly oriented collagen fibers native to the uninjured dermis encourage random cell alignment through contact guidance and do not transmit as much force as aligned collagen fibers. However, the linear ECM serves as a system for mechanotransduction between cells in a feed-forward mechanism, which perpetuates ECM remodeling and myofibroblast contraction. The electrospinning process allowed us to create scaffolds with randomly oriented fibers that promote random collagen deposition and decrease myofibroblast formation. Compared to an in vitro HSc contraction model, fibroblast-seeded PU scaffolds significantly decreased matrix and myofibroblast formation. In a murine HSc model, collagen-coated PU (ccPU) scaffolds significantly reduced HSc contraction as compared to untreated control wounds and wounds treated with the clinical standard of care. The data from this study suggest that electrospun ccPU scaffolds meet the requirements to mitigate HSc contraction, including reduction of in vitro HSc-related outcomes, diminished scar stiffness, and reduced scar contraction. While clinical dogma suggests treating severe burn patients with rapidly biodegrading skin equivalents, these data suggest that a longer-lasting scaffold may possess merit in reducing HSc.
In the second aim, we further investigate the impact of scaffold longevity on HSc contraction by studying a degradable, elastomeric, randomly oriented, electrospun micro-fibrous scaffold fabricated from the copolymer poly(l-lactide-co-ε-caprolactone) (PLCL). PLCL scaffolds displayed appropriate elastomeric and tensile characteristics for implantation beneath a human skin graft. In vitro analysis using normal human dermal fibroblasts (NHDF) demonstrated that PLCL scaffolds decreased myofibroblast formation as compared to an in vitro HSc contraction model. Using our murine HSc contraction model, we found that HSc contraction was significantly greater in animals treated with standard of care, Integra, as compared to those treated with collagen coated-PLCL (ccPLCL) scaffolds at d 56 following implantation. Finally, wounds treated with ccPLCL were significantly less stiff than control wounds at d 56 in vivo. Together, these data further solidify our hypothesis that scaffolds which persist throughout the remodeling phase of repair represent a clinically translatable method to prevent HSc contraction.
In the third aim, we attempt to optimize cell-scaffold interactions by employing an anti-inflammatory coating on electrospun PLCL scaffolds. The anti-inflammatory sub-epidermal glycosaminoglycan hyaluronic acid (HA) was used as a coating material for PLCL scaffolds to encourage a regenerative healing phenotype. To minimize local inflammation, an anti-TNFα monoclonal antibody (mAB) was conjugated to the HA backbone prior to PLCL coating. ELISA analysis confirmed mAB activity following conjugation to HA (HA+mAB) and following adsorption of HA+mAB to the PLCL backbone [(HA+mAB)PLCL]. Alcian blue staining demonstrated thorough HA coating of PLCL scaffolds using pressure-driven adsorption. In vitro studies demonstrated that treatment with (HA+mAB)PLCL prevented downstream inflammatory events in mouse macrophages treated with soluble TNFα. In vivo studies using our murine HSc contraction model suggested a positive impact of the HA coating, which was partially impeded by the inclusion of the anti-TNFα mAB. Further characterization of the inflammatory microenvironment of our murine model is required before conclusions can be drawn regarding the potential of anti-TNFα therapeutics for HSc. Together, our data demonstrate the development of a complex anti-inflammatory coating for PLCL scaffolds, and the potential impact of altering the ECM coating material on HSc contraction.
In the fourth aim, we investigate how scaffold design, specifically pore dimensions, can influence myofibroblast interactions and the subsequent formation of OB-cadherin-positive adherens junctions in vitro. We collaborated with Wake Forest University to produce 3D-printed (3DP) scaffolds with well-controlled pore sizes; we hypothesized that decreasing pore size would mitigate intercellular communication via OB-cadherin-positive adherens junctions. PU was 3D printed via pressure extrusion in a basket-weave design with a feature diameter of ~70 µm and pore sizes of 50, 100, or 150 µm. Tensile elastic moduli of 3DP scaffolds were similar to Integra; however, flexural moduli of 3DP scaffolds were significantly greater than Integra. 3DP scaffolds demonstrated ~50% porosity. Western blot data at 24 h and 5 d demonstrated significant increases in OB-cadherin expression in 100 µm pores relative to 50 µm pores, suggesting that pore size may play a role in regulating cell-cell communication. To analyze the impact of pore size in these scaffolds on scarring in vivo, scaffolds were implanted beneath skin grafts in a murine HSc model. While the flexural stiffness resulted in graft necrosis by d 14, cellular and blood vessel integration into the scaffolds was evident, suggesting potential for this design if employed in a less stiff material. In this study, we demonstrate for the first time that pore size alone impacts OB-cadherin protein expression in vitro, suggesting that pore size may play a role in the adherens junction formation affiliated with the fibroblast-to-myofibroblast transition. Overall, this work introduces a new bioengineered scaffold design to both study the mechanism behind HSc and prevent the clinical burden of this contractile disease.
Together, these studies inform the field of critical design parameters in scaffold design for the prevention of HSc contraction. We propose that scaffold 3D architectural design, surface chemistry, and longevity can be employed as key design parameters during the development of next generation, low-cost scaffolds to mitigate post-burn hypertrophic scar contraction. The lessening of post-burn scarring and scar contraction would improve clinical practice by reducing medical expenditures, increasing patient survival, and dramatically improving quality of life for millions of patients worldwide.
Abstract:
A substantial amount of information on the Internet is present in the form of text. The value of this semi-structured and unstructured data has been widely acknowledged, with consequent scientific and commercial exploitation. The ever-increasing data production, however, pushes data analytic platforms to their limit. This thesis proposes techniques for more efficient textual big data analysis suitable for the Hadoop analytic platform. This research explores the direct processing of compressed textual data. The focus is on developing novel compression methods with a number of desirable properties to support text-based big data analysis in distributed environments. The novel contributions of this work include the following. Firstly, a Content-aware Partial Compression (CaPC) scheme is developed. CaPC makes a distinction between informational and functional content in which only the informational content is compressed. Thus, the compressed data is made transparent to existing software libraries which often rely on functional content to work. Secondly, a context-free bit-oriented compression scheme (Approximated Huffman Compression) based on the Huffman algorithm is developed. This uses a hybrid data structure that allows pattern searching in compressed data in linear time. Thirdly, several modern compression schemes have been extended so that the compressed data can be safely split with respect to logical data records in distributed file systems. Furthermore, an innovative two layer compression architecture is used, in which each compression layer is appropriate for the corresponding stage of data processing. Peripheral libraries are developed that seamlessly link the proposed compression schemes to existing analytic platforms and computational frameworks, and also make the use of the compressed data transparent to developers. The compression schemes have been evaluated for a number of standard MapReduce analysis tasks using a collection of real-world datasets. In comparison with existing solutions, they have shown substantial improvement in performance and significant reduction in system resource requirements.
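The Huffman-based scheme above can be related to the textbook construction it approximates. The sketch below builds an ordinary Huffman prefix code from symbol frequencies; it is a generic illustration only and does not reproduce the thesis's approximated variant or its hybrid structure for linear-time pattern search in compressed data.

```python
# Minimal sketch of Huffman code construction, the textbook algorithm that the
# Approximated Huffman Compression scheme described above builds on.
# The thesis's hybrid pattern-search structure is not reproduced here.
import heapq
from collections import Counter

def huffman_codes(text):
    """Return a {symbol: bitstring} prefix code built from symbol frequencies."""
    heap = [[freq, i, {sym: ""}] for i, (sym, freq) in enumerate(Counter(text).items())]
    heapq.heapify(heap)
    next_id = len(heap)
    while len(heap) > 1:
        lo = heapq.heappop(heap)           # two least frequent subtrees
        hi = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in lo[2].items()}
        merged.update({s: "1" + c for s, c in hi[2].items()})
        heapq.heappush(heap, [lo[0] + hi[0], next_id, merged])
        next_id += 1
    return heap[0][2]

sample = "hadoop handles huge heaps of text"
codes = huffman_codes(sample)
encoded = "".join(codes[ch] for ch in sample)
print(len(encoded), "bits vs", 8 * len(sample), "bits uncompressed")
```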
Abstract:
Cloud computing offers massive scalability and elasticity required by many scientific and commercial applications. Combining the computational and data handling capabilities of clouds with parallel processing also has the potential to tackle Big Data problems efficiently. Science gateway frameworks and workflow systems enable application developers to implement complex applications and make these available for end-users via simple graphical user interfaces. The integration of such frameworks with Big Data processing tools on the cloud opens new opportunities for application developers. This paper investigates how workflow systems and science gateways can be extended with Big Data processing capabilities. A generic approach based on infrastructure aware workflows is suggested and a proof of concept is implemented based on the WS-PGRADE/gUSE science gateway framework and its integration with the Hadoop parallel data processing solution based on the MapReduce paradigm in the cloud. The provided analysis demonstrates that the methods described to integrate Big Data processing with workflows and science gateways work well in different cloud infrastructures and application scenarios, and can be used to create massively parallel applications for scientific analysis of Big Data.
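To make the MapReduce paradigm mentioned above concrete, the following framework-free sketch shows the map, shuffle, and reduce stages on a toy word-count job. It is purely illustrative; the integration described in the paper runs Hadoop jobs in the cloud through the WS-PGRADE/gUSE gateway rather than plain Python.

```python
# Minimal, framework-free sketch of the MapReduce paradigm referenced above:
# map, shuffle/group, reduce. Only an illustration of the programming model,
# not the Hadoop/WS-PGRADE/gUSE integration itself.
from collections import defaultdict
from itertools import chain

def map_phase(document):
    # Emit (word, 1) pairs, as a Hadoop mapper would.
    return [(word.lower(), 1) for word in document.split()]

def shuffle(pairs):
    # Group intermediate values by key (handled by the framework in Hadoop).
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    # Sum the counts per word, as a Hadoop reducer would.
    return {word: sum(counts) for word, counts in grouped.items()}

documents = ["big data on the cloud", "workflows move big data"]
pairs = chain.from_iterable(map_phase(d) for d in documents)
print(reduce_phase(shuffle(pairs)))   # e.g. {'big': 2, 'data': 2, ...}
```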
Abstract:
This paper is based on the novel use of a very high fidelity decimation filter chain for Electrocardiogram (ECG) signal acquisition and data conversion. The multiplier-free and multi-stage structure of the proposed filters lowers the power dissipation while minimizing the circuit area, both crucial design constraints for wireless, noninvasive, wearable health monitoring products, given the scarce operational resources available in their electronic implementation. The decimation ratio of the presented filter is 128, working in tandem with a 1-bit 3rd-order Sigma-Delta (ΣΔ) modulator, and it achieves 0.04 dB passband ripple and -74 dB stopband attenuation. The work reported here investigates the non-linear phase effects of the proposed decimation filters on the ECG signal by carrying out a comparative study after phase correction. It concludes that enhanced phase linearity is not crucial for ECG acquisition and data conversion applications, since the distortion of the acquired signal due to phase non-linearity is insignificant for both the original and the phase-compensated filters. To the best of the authors' knowledge, freedom from signal distortion is essential, since, as stated in the state of the art, such distortion might lead to misdiagnosis. This article demonstrates that, with their minimal power consumption and minimal signal distortion, the proposed decimation filters can effectively be employed in biosignal data processing units.
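As a rough illustration of the multi-stage structure, the sketch below decimates an oversampled signal by 128 in cascaded stages (8 x 4 x 4) using SciPy's generic FIR decimator. The sampling rate, the synthetic ECG-like signal, and the stage split are assumptions for demonstration; the paper's multiplier-free filter chain and ΣΔ bitstream are not reproduced.

```python
# Minimal sketch of multi-stage decimation by 128 (8 x 4 x 4) with SciPy,
# illustrating the cascaded structure discussed above. Generic linear-phase
# FIR stages stand in for the paper's multiplier-free filters.
import numpy as np
from scipy.signal import decimate

fs = 128 * 500                      # assumed oversampled rate feeding the decimator (Hz)
t = np.arange(0, 2.0, 1.0 / fs)
ecg_like = np.sin(2 * np.pi * 1.3 * t) + 0.1 * np.random.randn(t.size)  # stand-in signal

x = ecg_like
for q in (8, 4, 4):                 # overall decimation ratio 8 * 4 * 4 = 128
    x = decimate(x, q, ftype="fir", zero_phase=True)   # one FIR stage per factor

print(x.size, "samples at", fs // 128, "Hz")
```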
Abstract:
This paper is part of a special issue of Applied Geochemistry focusing on reliable applications of compositional multivariate statistical methods. This study outlines the application of compositional data analysis (CoDa) to the calibration of geochemical data and to multivariate statistical modelling of geochemistry and grain-size data from a set of Holocene sedimentary cores from the Ganges-Brahmaputra (G-B) delta. Over the last two decades, understanding near-continuous records of sedimentary sequences has required the use of core-scanning X-ray fluorescence (XRF) spectrometry, for both terrestrial and marine sedimentary sequences. Initial XRF data are generally unusable in 'raw' format, requiring data processing to remove instrument bias, as well as informed sequence interpretation. The applicability of conventional calibration equations to core-scanning XRF data is further limited by the constraints posed by unknown measurement geometry and specimen homogeneity, as well as matrix effects. Log-ratio based calibration schemes have been developed and applied to clastic sedimentary sequences, focusing mainly on energy-dispersive XRF (ED-XRF) core-scanning. This study applied high-resolution core-scanning XRF to Holocene sedimentary sequences from the tide-dominated Indian Sundarbans (Ganges-Brahmaputra delta plain). The Log-Ratio Calibration Equation (LRCE) was applied to a subset of core-scan and conventional ED-XRF data to quantify elemental composition, providing a robust calibration scheme based on reduced major axis regression of log-ratio transformed geochemical data. Through partial least squares (PLS) modelling of geochemical and grain-size data, it is possible to derive robust proxy information for the Sundarbans depositional environment. The application of these techniques to Holocene sedimentary data offers an improved methodological framework for unravelling Holocene sedimentation patterns.
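The log-ratio idea at the core of CoDa can be shown in a few lines. The sketch below applies the centred log-ratio (clr) transform, which removes the closure constraint before standard multivariate methods are applied; the element counts are hypothetical and the snippet does not implement the LRCE calibration itself.

```python
# Minimal sketch of the centred log-ratio (clr) transform that underpins
# compositional data analysis (CoDa) approaches such as the log-ratio
# calibration above. The counts below are hypothetical, not Sundarbans data.
import numpy as np

def clr(composition):
    """Centred log-ratio transform of a strictly positive composition."""
    x = np.asarray(composition, dtype=float)
    g = np.exp(np.mean(np.log(x)))        # geometric mean of the parts
    return np.log(x / g)

# Hypothetical core-scanner counts for (Si, Ca, Fe, Ti) at one depth.
counts = [52000.0, 18000.0, 9000.0, 1200.0]
print(clr(counts))                         # clr coordinates of the composition
print(np.isclose(clr(counts).sum(), 0.0))  # closure removed: components sum to ~0
```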
Abstract:
Field-programmable gate arrays are ideal hosts to custom accelerators for signal, image, and data processing but demand manual register transfer level design if high performance and low cost are desired. High-level synthesis reduces this design burden but requires manual design of complex on-chip and off-chip memory architectures, a major limitation in applications such as video processing. This paper presents an approach to resolve this shortcoming. A constructive process is described that can derive such accelerators, including on- and off-chip memory storage from a C description such that a user-defined throughput constraint is met. By employing a novel statement-oriented approach, dataflow intermediate models are derived and used to support simple approaches for on-/off-chip buffer partitioning, derivation of custom on-chip memory hierarchies and architecture transformation to ensure user-defined throughput constraints are met with minimum cost. When applied to accelerators for full search motion estimation, matrix multiplication, Sobel edge detection, and fast Fourier transform, it is shown how real-time performance up to an order of magnitude in advance of existing commercial HLS tools is enabled whilst including all requisite memory infrastructure. Further, optimizations are presented that reduce the on-chip buffer capacity and physical resource cost by up to 96% and 75%, respectively, whilst maintaining real-time performance.
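The on-chip buffering concern described above can be pictured with a small software analogue: a streaming Sobel filter that holds only a three-row window of the image, much as a hardware pipeline would hold a few line buffers rather than a whole frame. This is a hypothetical Python illustration of the buffering idea, not HLS input code or the paper's derivation process.

```python
# Minimal sketch of the on-chip buffering idea discussed above: a streaming
# Sobel edge detector that keeps only three image rows in "line buffers"
# instead of the whole frame. Python illustration only, not HLS-ready code.
import numpy as np

def sobel_stream(rows, width):
    """Consume an image row by row; yield gradient-magnitude rows.

    Only a three-row window is held at any time, mirroring the small on-chip
    buffers a hardware accelerator would use in a row-streaming pipeline.
    """
    window = []                                  # the "line buffers"
    gx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]])
    gy = gx.T
    for row in rows:
        window.append(np.asarray(row, dtype=float))
        if len(window) < 3:
            continue
        out = np.zeros(width)
        for c in range(1, width - 1):
            patch = np.array([w[c - 1:c + 2] for w in window])
            out[c] = np.hypot((gx * patch).sum(), (gy * patch).sum())
        yield out
        window.pop(0)                            # slide the window down one row

image = np.random.rand(8, 16)
edges = list(sobel_stream(image, width=16))      # 6 output rows for 8 input rows
```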
Abstract:
PEDRINI, Aldomar; SZOKOLAY, Steven. Recomendações para o desenvolvimento de uma ferramenta de suporte às primeiras decisões projetuais visando ao desempenho energético de edificações de escritório em clima quente. Ambiente Construído, Porto Alegre, v. 5, n. 1, p. 39-54, jan./mar. 2005. Quarterly. Available at:
Abstract:
Wireless sensor networks (WSNs) differ from conventional distributed systems in many aspects. The resource limitation of sensor nodes, the ad-hoc communication and topology of the network, coupled with an unpredictable deployment environment, are difficult non-functional constraints that must be carefully taken into account when developing software systems for a WSN. Thus, more research needs to be done on designing, implementing and maintaining software for WSNs. This thesis aims to contribute to research in this area by presenting an approach to WSN application development that improves the reusability, flexibility, and maintainability of the software. Firstly, we present a programming model and software architecture aimed at describing WSN applications independently of the underlying operating system and hardware. The proposed architecture is described and realized using the Model-Driven Architecture (MDA) standard in order to achieve satisfactory levels of encapsulation and abstraction when programming sensor nodes. In addition, we study different non-functional constraints of WSN applications and propose two approaches to optimize applications so that they satisfy these constraints. A real prototype framework was built to demonstrate the solutions developed in the thesis. The framework implements the programming model and the multi-layered software architecture as components. A graphical interface, code generation components and supporting tools were also included to help developers design, implement, optimize, and test WSN software. Finally, we evaluate and critically assess the proposed concepts. Two case studies are provided to support the evaluation. The first case study, a framework evaluation, is designed to assess the ease with which novice and intermediate users can develop correct and power-efficient WSN applications, the portability level achieved by developing applications at a high level of abstraction, and the estimated overhead due to usage of the framework in terms of the footprint and executable code size of the application. In the second case study, we discuss the design, implementation and optimization of a real-world application named TempSense, where a sensor network is used to monitor the temperature within an area.
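The separation between platform-independent application logic and platform-specific bindings that MDA promotes can be sketched briefly. The example below keeps TempSense-style sampling logic behind an abstract driver interface so that the same application model could be bound to different node platforms; the class and method names are hypothetical, not the thesis's actual framework API.

```python
# Minimal sketch of a platform-independent application model for a sensor node:
# the sampling/alarm logic depends only on an abstract driver interface, and a
# platform-specific binding is supplied separately. Names are illustrative only.
from abc import ABC, abstractmethod

class TemperatureDriver(ABC):
    """Platform-specific part: bound to the node's OS/hardware at deployment."""
    @abstractmethod
    def read_celsius(self) -> float: ...

class TempSenseApp:
    """Platform-independent part: pure application logic, reusable across nodes."""
    def __init__(self, driver: TemperatureDriver, threshold: float):
        self.driver, self.threshold = driver, threshold

    def sample(self) -> dict:
        value = self.driver.read_celsius()
        return {"temperature": value, "alarm": value > self.threshold}

# A simulated binding stands in here for a real node-level driver.
class SimulatedDriver(TemperatureDriver):
    def read_celsius(self) -> float:
        return 27.5

print(TempSenseApp(SimulatedDriver(), threshold=30.0).sample())
```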
Abstract:
The Italian territory offers a wide range of treasures in the field of Cultural Heritage. This is a highly valuable heritage, which requires careful management and preservation performed with appropriate tools, with attention to maintenance and protection from risk factors. Nowadays, the rapid development of new digital technologies, together with the remarkable advances in Geomatics, makes possible an efficient integration of different techniques, helped also by the spread of solutions that improve data import/export and transmission between devices. The main objective of this thesis is to test the photogrammetric restitution implemented in a commercial digital photogrammetry package, in order to generate a dense 3D model of the facade of the Basilica di Sant'Apollinare Nuovo in Ravenna. The first chapter, after a general introduction to the 3D survey of Cultural Heritage and some considerations on the use of digital photogrammetry in this field, analyzes the stereoscopic and monoscopic approaches, developing in particular the theme of close-range photogrammetry. The second chapter presents the theme of digital images, from color theory to their display on the monitor. The third chapter develops the case study of the Basilica di Sant'Apollinare Nuovo and its historical, architectural, and religious context; it also examines the photogrammetric and laser-scanning survey of the case study. The final part of the same chapter covers data processing with the software Agisoft PhotoScan, in order to generate, by means of the Structure from Motion technique, a digital geometric 3D model of the Basilica facade. The digital model was scaled on the basis of measurements made in the field. With the software it was possible to carry out the three phases of photogrammetric data processing: interior orientation, exterior orientation, and restitution.
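The final scaling step can be illustrated with a short sketch: the ratio between a distance taped on the facade and the same distance measured in the unscaled model gives the scale factor applied to all model coordinates. The coordinates and the 6.30 m measurement below are hypothetical, not values from the survey.

```python
# Minimal sketch of scaling an initially arbitrary-scale photogrammetric model
# using a distance measured on site, as described above. Values are illustrative.
import numpy as np

model_points = np.array([[0.00, 0.00, 0.00],      # model-space coordinates
                         [0.83, 0.00, 0.02],
                         [0.41, 1.27, 0.05]])

# The same edge identified in the model and measured on the facade:
model_distance = np.linalg.norm(model_points[1] - model_points[0])   # model units
field_distance = 6.30                                                # metres, taped on site

scale = field_distance / model_distance
scaled_points = model_points * scale       # model now expressed in metres
print(round(scale, 3), scaled_points[1])
```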