884 results for Industrial efficiency -- Sri Lanka -- Measurement -- Data processing.
Abstract:
This paper examines the extent to which both network structure and spatial factors impact on the organizational performance of universities as measured by the generation of industrial research income. Drawing on data concerning the interactions of universities in the UK with large research and development (R&D)-intensive firms, the paper employs both social network analysis and regression analysis. It is found that the structural position of a university within networks with large R&D-intensive firms is significantly associated with the level of research income gained from industry. Spatial factors, on the other hand, are not found to be clearly associated with performance, suggesting that universities operate on a level playing field across regional environments once other factors are controlled for.
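As an illustrative sketch only (not the paper's actual data or model), the association between structural position and industrial research income could be explored by computing a centrality score for each university in a university-firm collaboration network and regressing income on it; the network, income figures, and choice of degree centrality below are all hypothetical.

```python
import networkx as nx
import numpy as np

# Hypothetical bipartite collaboration network: universities linked to R&D-intensive firms.
edges = [("UnivA", "Firm1"), ("UnivA", "Firm2"), ("UnivA", "Firm3"),
         ("UnivB", "Firm1"), ("UnivB", "Firm2"),
         ("UnivC", "Firm3")]
G = nx.Graph(edges)

universities = ["UnivA", "UnivB", "UnivC"]
centrality = nx.degree_centrality(G)              # proxy for structural position
x = np.array([centrality[u] for u in universities])

# Hypothetical industrial research income (GBP millions) for each university.
income = np.array([12.0, 7.5, 2.0])

# Simple least-squares fit of income on centrality (a real analysis would add controls
# for spatial and institutional factors, as the paper does).
slope, intercept = np.polyfit(x, income, 1)
print(f"income ~ {intercept:.2f} + {slope:.2f} * centrality")
```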
Abstract:
Field material testing provides firsthand information on pavement conditions, which is most helpful in evaluating performance and identifying preventive maintenance or overlay strategies. The high variability of field asphalt concrete due to construction raises the demand for test accuracy. Accordingly, the objective of this study is to propose a reliable and repeatable methodology to evaluate the fracture properties of field-aged asphalt concrete using the overlay test (OT). The OT is selected because of its efficiency and feasibility for asphalt field cores with diverse dimensions. The fracture properties refer to the Paris' law parameters based on the pseudo J-integral (A and n), because of the sound physical significance of the pseudo J-integral with respect to characterizing the cracking process. In order to determine A and n, a two-step OT protocol is designed to characterize the undamaged and damaged behaviors of asphalt field cores. To ensure the accuracy of the determined undamaged and fracture properties, a new analysis method is then developed for data processing, which combines finite element simulations with mechanical analysis of viscoelastic force equilibrium and the evolution of pseudo displacement work in the OT specimen. Finally, theoretical equations are derived to calculate A and n directly from the OT test data, and the accuracy of the determined fracture properties is verified. The proposed methodology is applied to a total of 27 asphalt field cores obtained from a field project in Texas, including the control Hot Mix Asphalt (HMA) and two types of warm mix asphalt (WMA). The results demonstrate a high linear correlation between n and −log A for all the tested field cores. Investigations of the effect of field aging on the fracture properties confirm that n is a good indicator for quantifying the cracking resistance of asphalt concrete, and that summer climatic conditions clearly accelerate the rate of aging. The impact of the WMA technologies on the fracture properties of asphalt concrete is visualized by comparing the n-values: the Evotherm WMA technology slightly improves the cracking resistance, while the foaming WMA technology provides fracture properties comparable to those of the HMA. After 15 months of aging in the field, the cracking resistance does not differ significantly between the HMA and WMAs, which is confirmed by observations of field distresses.
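As a rough illustration only (the study derives A and n analytically from OT data via viscoelastic analysis rather than by regression), a Paris'-law-type relation based on the pseudo J-integral, dc/dN = A·(J_R)^n, can be fitted as a log-log linear regression when crack growth rates and pseudo J-integral values are available; the numbers below are hypothetical.

```python
import numpy as np

# Hypothetical per-cycle crack growth rates dc/dN (mm/cycle) and pseudo J-integral values J_R.
dc_dN = np.array([2.0e-4, 6.5e-4, 1.8e-3, 5.2e-3])
J_R   = np.array([0.05,   0.12,   0.28,   0.60])

# Paris'-law form dc/dN = A * (J_R)^n  =>  log(dc/dN) = log(A) + n * log(J_R)
n, logA = np.polyfit(np.log10(J_R), np.log10(dc_dN), 1)   # fitted exponent n and intercept log A
A = 10 ** logA
print(f"n = {n:.2f}, A = {A:.3e}, -log A = {-logA:.2f}")
```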
Abstract:
This dissertation presents an analysis of the impacts of trade policy reforms in Sri Lanka. A Computable General Equilibrium (CGE) model is constructed with a detailed description of the domestic production structure and foreign trade. The model is then used to investigate the effects of trade policy reforms on resource allocation and welfare. Prior to 1977, Sri Lanka maintained stringent control over its imports through rigid quantitative restrictions. A new economic policy reform package was introduced in 1977, and it shifted Sri Lanka's development strategy toward an export-oriented policy regime. The shift of policy focus from a restrictive trade regime toward a more open trade regime is expected to have a significant impact on the volume of external trade, the domestic production structure, the allocation of resources, and social welfare. Simulations are carried out to assess the effects of three major policy reforms: (1) a devaluation of the Sri Lanka rupee, (2) a partial or complete elimination of export duties, and (3) a devaluation-cum-removal of export duties. Simulation results indicate that the macroeconomic impact of a devaluation-cum-removal of export duties can be substantial. They also suggest that the resource-pull effects of a devaluation and of a devaluation-cum-export-duty-removal policy are significant. However, the model shows that a devaluation combined with an export duty reduction is likely to be a superior strategy.
Abstract:
This research presents several components encompassing the scope of the objective of Data Partitioning and Replication Management in a Distributed GIS Database. Modern Geographic Information Systems (GIS) databases are often large and complicated; therefore, data partitioning and replication management problems need to be addressed in the development of an efficient and scalable solution. Part of the research is to study the patterns of geographical raster data processing and to propose algorithms to improve the availability of such data. These algorithms and approaches target the granularity of geographic data objects as well as data partitioning in geographic databases to achieve high data availability and Quality of Service (QoS), considering distributed data delivery and processing. To achieve this goal, a dynamic, real-time approach for mosaicking digital images of different temporal and spatial characteristics into tiles is proposed. This dynamic approach reuses digital images upon demand and generates mosaicked tiles only for the required region, according to the user's requirements such as resolution, temporal range, and target bands, to reduce redundancy in storage and to utilize available computing and storage resources more efficiently. Another part of the research pursued methods for the efficient acquisition of GIS data from external heterogeneous databases and Web services, as well as end-user GIS data delivery enhancements, automation, and 3D virtual reality presentation. Vast numbers of computing, network, and storage resources on the Internet sit idle or underutilized. The proposed "Crawling Distributed Operating System" (CDOS) approach employs such resources and creates benefits for the hosts that lend their CPU, network, and storage resources to be used in a GIS database context. The results of this dissertation demonstrate effective ways to develop a highly scalable GIS database. The approach developed in this dissertation has resulted in the creation of the TerraFly GIS database, which is used by the US government, researchers, and the general public to facilitate Web access to remotely sensed imagery and GIS vector information.
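A minimal sketch, using hypothetical tile metadata and function names, of the kind of on-demand selection the mosaicking approach implies: given a requested bounding box, resolution, and temporal range, pick only the source images that intersect the request instead of pre-generating every tile.

```python
from dataclasses import dataclass

@dataclass
class RasterImage:
    name: str
    bbox: tuple           # (min_lon, min_lat, max_lon, max_lat)
    resolution_m: float   # ground resolution in metres
    year: int

def intersects(a, b):
    """Axis-aligned bounding-box intersection test."""
    return not (a[2] < b[0] or b[2] < a[0] or a[3] < b[1] or b[3] < a[1])

def select_sources(catalog, request_bbox, max_resolution_m, year_range):
    """Return only the images needed to mosaic the requested region (hypothetical criteria)."""
    return [img for img in catalog
            if intersects(img.bbox, request_bbox)
            and img.resolution_m <= max_resolution_m
            and year_range[0] <= img.year <= year_range[1]]

# Hypothetical catalog and user request.
catalog = [
    RasterImage("scene_a", (-80.5, 25.0, -80.0, 25.5), 1.0, 2004),
    RasterImage("scene_b", (-80.2, 25.3, -79.7, 25.8), 0.5, 2006),
    RasterImage("scene_c", (-81.0, 26.0, -80.5, 26.5), 1.0, 2005),
]
print(select_sources(catalog, (-80.3, 25.2, -80.1, 25.4), 1.0, (2003, 2006)))
```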
Abstract:
Microarray technology provides a high-throughput technique to study gene expression. Microarrays can help us diagnose different types of cancers, understand biological processes, assess host responses to drugs and pathogens, find markers for specific diseases, and much more. Microarray experiments generate large amounts of data; thus, effective data processing and analysis are critical for making reliable inferences from the data. The first part of the dissertation addresses the problem of finding an optimal set of genes (biomarkers) to classify a set of samples as diseased or normal. Three statistical gene selection methods (GS, GS-NR, and GS-PCA) were developed to identify a set of genes that best differentiate between samples. A comparative study on different classification tools was performed, and the best combinations of gene selection and classifiers for multi-class cancer classification were identified. For most of the benchmark cancer data sets, the gene selection method proposed in this dissertation, GS, outperformed other gene selection methods. The classifiers based on Random Forests, neural network ensembles, and K-nearest neighbor (KNN) showed consistently good performance. A striking commonality among these classifiers is that they all use a committee-based approach, suggesting that ensemble classification methods are superior. The same biological problem may be studied at different research labs and/or performed using different lab protocols or samples. In such situations, it is important to combine results from these efforts. The second part of the dissertation addresses the problem of pooling the results from different independent experiments to obtain improved results. Four statistical pooling techniques (Fisher's inverse chi-square method, the Logit method, Stouffer's Z-transform method, and the Liptak-Stouffer weighted Z-method) were investigated in this dissertation. These pooling techniques were applied to the problem of identifying cell cycle-regulated genes in two different yeast species; as a result, improved sets of cell cycle-regulated genes were identified. The last part of the dissertation explores the effectiveness of wavelet data transforms for the task of clustering. Discrete wavelet transforms, with an appropriate choice of wavelet bases, were shown to be effective in producing clusters that were biologically more meaningful.
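A small, hedged sketch of how three of the p-value pooling techniques named above are commonly implemented (Fisher's inverse chi-square, Stouffer's Z-transform, and a Liptak-style weighted Z); the p-values and weights are hypothetical, and the dissertation's exact variants may differ.

```python
import numpy as np
from scipy import stats

def fisher_pool(pvals):
    """Fisher's inverse chi-square method: -2*sum(ln p) ~ chi2 with 2k degrees of freedom."""
    stat = -2.0 * np.sum(np.log(pvals))
    return stats.chi2.sf(stat, df=2 * len(pvals))

def stouffer_pool(pvals):
    """Stouffer's Z-transform method: average the z-scores of the individual p-values."""
    z = stats.norm.isf(pvals)                 # one-sided z-scores
    return stats.norm.sf(np.sum(z) / np.sqrt(len(pvals)))

def liptak_weighted_pool(pvals, weights):
    """Liptak-Stouffer weighted Z-method: weight each study's z-score (e.g., by sample size)."""
    z = stats.norm.isf(pvals)
    combined = np.sum(weights * z) / np.sqrt(np.sum(weights ** 2))
    return stats.norm.sf(combined)

# Hypothetical p-values for one gene across three independent experiments.
p = np.array([0.04, 0.20, 0.01])
w = np.array([30, 12, 45])                    # hypothetical per-study weights
print(fisher_pool(p), stouffer_pool(p), liptak_weighted_pool(p, w))
```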
Abstract:
This dissertation established a software-hardware integrated design for a multisite data repository in pediatric epilepsy. A total of 16 institutions formed a consortium for this web-based application. This innovative, fully operational web application allows users to upload and retrieve information through a unique human-computer graphical interface that is remotely accessible to all users of the consortium. A solution based on a Linux platform with MySQL and Personal Home Page (PHP) scripts was selected. Research was conducted to evaluate mechanisms to electronically transfer diverse datasets from different hospitals and to collect the clinical data in concert with the related functional magnetic resonance imaging (fMRI). What was unique in the approach considered was that all pertinent clinical information about patients was synthesized, with input from clinical experts, into four different forms: Clinical, fMRI scoring, Image information, and Neuropsychological data entry forms. A first contribution of this dissertation was in proposing an integrated processing platform that was site and scanner independent, in order to uniformly process the varied fMRI datasets and to generate comparative brain activation patterns. The data collection from the consortium complied with IRB requirements and provides all the safeguards for security and confidentiality requirements. An fMRI-based software library was used to perform data processing and statistical analysis to obtain the brain activation maps. The Lateralization Index (LI) of healthy control (HC) subjects was evaluated in contrast to that of localization-related epilepsy (LRE) subjects. Over 110 activation maps were generated, and their respective LIs were computed, yielding the following groups: (a) strong right lateralization (HC = 0%, LRE = 18%), (b) right lateralization (HC = 2%, LRE = 10%), (c) bilateral (HC = 20%, LRE = 15%), (d) left lateralization (HC = 42%, LRE = 26%), and (e) strong left lateralization (HC = 36%, LRE = 31%). Moreover, nonlinear multidimensional decision functions were used to seek an optimal separation between typical and atypical brain activations on the basis of the demographics as well as the extent and intensity of these brain activations. The intent was not to seek the highest output measures given the inherent overlap of the data, but rather to assess which of the many dimensions were critical in the overall assessment of typical and atypical language activations, with the freedom to select any number of dimensions and impose any degree of complexity in the nonlinearity of the decision space.
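A minimal sketch of the standard laterality-index computation such studies typically use, LI = (L − R) / (L + R) over suprathreshold activation in homologous language regions; the activation counts and the category cut-offs below are hypothetical, not the dissertation's.

```python
def lateralization_index(left_voxels, right_voxels):
    """LI = (L - R) / (L + R); +1 is fully left-lateralized, -1 fully right-lateralized."""
    total = left_voxels + right_voxels
    return 0.0 if total == 0 else (left_voxels - right_voxels) / total

def categorize(li, strong=0.5, weak=0.2):
    """Map an LI value onto the five groups listed above (thresholds are assumptions)."""
    if li >= strong:
        return "strong left lateralization"
    if li >= weak:
        return "left lateralization"
    if li <= -strong:
        return "strong right lateralization"
    if li <= -weak:
        return "right lateralization"
    return "bilateral"

# Hypothetical suprathreshold voxel counts in left/right language ROIs for one subject.
li = lateralization_index(left_voxels=820, right_voxels=310)
print(round(li, 2), categorize(li))
```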
Abstract:
There is an increasing demand for DNA analysis because of the sensitivity of the method and the ability to uniquely identify and distinguish individuals with a high degree of certainty. However, this demand has led to huge backlogs in evidence lockers, since the current DNA extraction protocols require long processing times. The DNA analysis procedure becomes more complicated when analyzing sexual assault casework samples, where the evidence contains more than one contributor. Additional processing to separate different cell types in order to simplify the final data interpretation further contributes to the existing cumbersome protocols. The goal of the present project is to develop a rapid and efficient extraction method that permits selective digestion of mixtures. Selective recovery of male DNA was achieved with as little as 15 minutes of lysis time upon exposure to high pressure under alkaline conditions. Pressure cycling technology (PCT) is carried out in a barocycler that has a small footprint and is semi-automated. Whereas typically less than 10% of male DNA is recovered using the standard extraction protocol for rape kits, almost seven times more male DNA was recovered from swabs using this novel method. Various parameters, including instrument settings and buffer composition, were optimized to achieve selective recovery of sperm DNA. Developmental validation studies were also done to determine the efficiency of this method in processing samples exposed to various conditions that can affect the quality of the extraction and the final DNA profile. An easy-to-use interface, minimal manual interference, and the ability to achieve high yields with simple reagents in a relatively short time make this an ideal method for potential application in analyzing sexual assault samples.
Abstract:
Underwater sound is very important in the field of oceanography, where it is used for remote sensing in much the same way that radar is used in atmospheric studies. One way to mathematically model sound propagation in the ocean is the parabolic-equation method, a technique that allows range-dependent environmental parameters. More importantly, this method can model sound transmission where the source emits either a pure tone or a short pulse of sound. Based on the parabolic approximation method and using the split-step Fourier algorithm, a computer model for underwater sound propagation was designed and implemented. This computer model differs from previous models in its use of the interactive mode, structured programming, modular design, and state-of-the-art graphics displays. In addition, the model maximizes the efficiency of computer time through synchronization of loosely coupled dual processors and the design of a restart capability. Since the model is designed for adaptability and for users with limited computer skills, it is anticipated that it will have many applications in the scientific community.
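For illustration only (the abstract does not give the model's equations), the split-step Fourier algorithm for the standard parabolic equation is commonly written as alternating diffraction and refraction steps; the sketch below marches a pressure-field envelope psi(z) out in range with numpy, using hypothetical environmental values.

```python
import numpy as np

def split_step_pe(psi0, n_index, freq_hz, c0, dz, dr, n_steps):
    """March the reduced field psi(z) out in range with the split-step Fourier method."""
    k0 = 2 * np.pi * freq_hz / c0                        # reference wavenumber
    kz = 2 * np.pi * np.fft.fftfreq(psi0.size, d=dz)     # vertical wavenumber grid
    diffraction = np.exp(-1j * kz ** 2 * dr / (2 * k0))  # free-space step in the spectral domain
    refraction = np.exp(1j * k0 * (n_index - 1.0) * dr)  # phase screen from the index of refraction
    psi = psi0.astype(complex)
    for _ in range(n_steps):
        psi = np.fft.ifft(diffraction * np.fft.fft(psi))
        psi = refraction * psi
    return psi

# Hypothetical setup: 25 Hz source, 1500 m/s reference sound speed, Gaussian starter at 100 m depth.
z = np.arange(0, 4096) * 1.0                             # depth grid (m)
c = np.full(z.size, 1500.0) + 0.016 * z                  # hypothetical sound-speed profile (m/s)
n_index = 1500.0 / c                                     # index of refraction
psi0 = np.exp(-((z - 100.0) / 20.0) ** 2)                # Gaussian starting field
field = split_step_pe(psi0, n_index, freq_hz=25.0, c0=1500.0, dz=1.0, dr=50.0, n_steps=200)
print(np.abs(field).max())
```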
Abstract:
Strategic IT Alignment is considered the first step within the IT Governance process of any institution. Taking as a starting point the recognition that corporate governance has an overall view of the organization, IT Governance acts as the subset responsible for implementing the organization's strategies with respect to providing the tools necessary to achieve the goals set in the Institutional Development Plan. To this end, COBIT specifies that such governance shall be built on the following principles: Strategic Alignment, Value Delivery, Risk Management, and Performance Measurement. This paper focuses on Strategic Alignment, considered by the authors to be the foundation for the development of the entire IT Governance core. By deepening its technical knowledge of management system development, UFRN has taken a decisive step towards the technical capability needed for "Value Delivery"; yet, by examining the processes primarily assigned to "Strategic Alignment", gaps were found that limited the strategic view of IT in the implementation of organizational goals. In a qualitative study that used documentary research with content analysis and interviews with strategic and tactical managers, the view of the role of SINFO – Superintendência de Informática – was mapped. The documentary research drew on public documents available on the institutional site and on documents from TCU – Tribunal de Contas da União – that map IT Governance profiles across the federal public service as a whole. To make the documentary research results comparable, questionnaires/interviews and iGovTI indexes were used as quantitative tools for standardizing the results, always keeping the same scale elements used in the TCU analysis. Accordingly, similarly to what the TCU study provides through the iGovTI index, this paper proposes an index specific to the study area – SA (Strategic Alignment) – calculated from the representative variables of the COBIT 4.1 domains and having the representative variables of the Strategic Alignment primary process as components. As a result, an index value intermediate between those of two consecutive TCU surveys, carried out in 2010 and 2012, was found, which reflects the attitude and view of managers towards IT governance: still tied to a Data Processing model, in which a department performs its tasks according to the demands of the various departments or sectors, although there is a commission that discusses issues related to infrastructure acquisition and systems development. With an operational rather than strategic/managerial view and low adoption of the tools established in the market, several processes are not covered by the set defined in the COBIT framework; this is mainly due to the absence of a formal strategic plan for IT, hence the only partial congruence between organizational goals and IT goals.
Abstract:
Sr, Nd, and Os isotopic data are presented for sediments from diverse locations in the Bay of Bengal. These data allow the samples to be divided into three groups, related to their sedimentary contexts. The first group, mainly composed of sediments from the shelf off Bangladesh and the currently active fan, has Sr and Nd characteristics consistent with a dominantly Himalayan source. Their 187Os/188Os ratios (~1.2-1.5) show that the average detrital material delivered by the Ganga-Brahmaputra (G-B) river system is not unusually radiogenic. A large difference in 187Os/188Os ratio exists between these Bengal Fan sediments and Ganga bedloads (187Os/188Os ~2.5, Pierson-Wickmann et al. (2000, doi:10.1016/S0012-821X(00)00003-0)). This difference mainly reflects addition of a less radiogenic Brahmaputra component, though mineralogical sorting and loss of radiogenic Os during transport may also play some role. The second sample group contains sediments from elsewhere in the Bay, particularly those located on the continental slope. They display Os isotopic compositions (0.99-1.11) similar to that of present seawater and higher Os and Re concentrations. These characteristics suggest the presence of a large hydrogenous contribution, consistent with the lower sedimentation rate of these samples. Sr and Nd ratios indicate that a significant fraction of these sediments is derived from erosion of non-Himalayan sources, such as the Indo-Burman range. These observations could be explained by the deflection of sediments from the G-B river system by westward currents in the head of the Bay. The third group contains only one sample, but shows that in addition to a Himalayan source, sediment discharge from Sri Lanka may influence the detrital component in the distal part of the fan. The similarity between the isotopic compositions of the group I R/V Sonne samples and those of Ocean Drilling Program Leg 116 (France-Lanord et al., 1993; Reisberg et al., 1997, doi:10.1016/S0012-821X(00)00003-0) suggests that the material eroding in the Himalayas has been roughly constant since the Miocene. The high Os isotopic ratios of leachates of both Sonne group I and Miocene Leg 116 sediments imply that much of the leachable highly radiogenic Os component was conserved during transport through the estuary or interaction with seawater. In contrast, samples with lower, but still relatively high, sedimentation rates (Sonne groups II and III and Pliocene Leg 116) seem to have significantly adsorbed or exchanged Os and Re with seawater. This suggests that in some cases the Os isotopic ratios of leachates of detrital sediments can be used to constrain the ancient marine Os record, or conversely, to date unfossiliferous sediments.
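To illustrate the mixing argument (fan sediments at 187Os/188Os ≈ 1.2-1.5 versus Ganga bedloads at ≈ 2.5 being explained by addition of a less radiogenic Brahmaputra component), a standard concentration-weighted two-component mixing calculation is sketched below; the Brahmaputra end-member ratio and both Os concentrations are purely hypothetical, and weighting by total Os is only a first-order approximation.

```python
def os_mixing_ratio(f_ganga, r_ganga, c_ganga, r_brahma, c_brahma):
    """187Os/188Os of a two-component mixture, weighted by Os concentration.
    f_ganga is the mass fraction of Ganga-derived sediment (0-1)."""
    num = f_ganga * c_ganga * r_ganga + (1 - f_ganga) * c_brahma * r_brahma
    den = f_ganga * c_ganga + (1 - f_ganga) * c_brahma
    return num / den

# Ganga bedload ratio ~2.5 is taken from the abstract; the Brahmaputra ratio and the
# Os concentrations (pg/g) below are hypothetical end-member values for illustration.
for f in (0.3, 0.5, 0.7):
    print(f, round(os_mixing_ratio(f, r_ganga=2.5, c_ganga=30.0, r_brahma=0.9, c_brahma=40.0), 2))
```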
Abstract:
A substantial amount of information on the Internet is present in the form of text. The value of this semi-structured and unstructured data has been widely acknowledged, with consequent scientific and commercial exploitation. The ever-increasing data production, however, pushes data analytic platforms to their limits. This thesis proposes techniques for more efficient textual big data analysis suitable for the Hadoop analytic platform. This research explores the direct processing of compressed textual data. The focus is on developing novel compression methods with a number of desirable properties to support text-based big data analysis in distributed environments. The novel contributions of this work include the following. Firstly, a Content-aware Partial Compression (CaPC) scheme is developed. CaPC makes a distinction between informational and functional content, in which only the informational content is compressed. Thus, the compressed data is made transparent to existing software libraries, which often rely on functional content to work. Secondly, a context-free bit-oriented compression scheme (Approximated Huffman Compression) based on the Huffman algorithm is developed. This uses a hybrid data structure that allows pattern searching in compressed data in linear time. Thirdly, several modern compression schemes have been extended so that the compressed data can be safely split with respect to logical data records in distributed file systems. Furthermore, an innovative two-layer compression architecture is used, in which each compression layer is appropriate for the corresponding stage of data processing. Peripheral libraries are developed that seamlessly link the proposed compression schemes to existing analytic platforms and computational frameworks, and also make the use of the compressed data transparent to developers. The compression schemes have been evaluated for a number of standard MapReduce analysis tasks using a collection of real-world datasets. In comparison with existing solutions, they have shown substantial improvement in performance and significant reduction in system resource requirements.
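The thesis's Approximated Huffman Compression is not specified in the abstract, so as background only, here is a compact sketch of the classical Huffman coding it builds on: a frequency-ordered heap is merged bottom-up into a prefix-free codebook.

```python
import heapq
from collections import Counter

def huffman_codes(text):
    """Build a Huffman codebook {symbol: bitstring} from symbol frequencies."""
    freq = Counter(text)
    # Heap items: (frequency, tie-breaker, [(symbol, code-so-far), ...]).
    heap = [(f, i, [(sym, "")]) for i, (sym, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    if len(heap) == 1:                                  # degenerate single-symbol input
        return {heap[0][2][0][0]: "0"}
    counter = len(heap)
    while len(heap) > 1:
        f1, _, left = heapq.heappop(heap)
        f2, _, right = heapq.heappop(heap)
        merged = [(s, "0" + c) for s, c in left] + [(s, "1" + c) for s, c in right]
        heapq.heappush(heap, (f1 + f2, counter, merged))
        counter += 1
    return dict(heap[0][2])

text = "the quick brown fox jumps over the lazy dog"
codes = huffman_codes(text)
encoded = "".join(codes[ch] for ch in text)
print(len(encoded), "bits compressed vs", 8 * len(text), "bits uncompressed")
```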
Abstract:
Cloud computing offers the massive scalability and elasticity required by many scientific and commercial applications. Combining the computational and data handling capabilities of clouds with parallel processing also has the potential to tackle Big Data problems efficiently. Science gateway frameworks and workflow systems enable application developers to implement complex applications and make these available for end-users via simple graphical user interfaces. The integration of such frameworks with Big Data processing tools on the cloud opens new opportunities for application developers. This paper investigates how workflow systems and science gateways can be extended with Big Data processing capabilities. A generic approach based on infrastructure-aware workflows is suggested, and a proof of concept is implemented based on the WS-PGRADE/gUSE science gateway framework and its integration with the Hadoop parallel data processing solution based on the MapReduce paradigm in the cloud. The provided analysis demonstrates that the methods described to integrate Big Data processing with workflows and science gateways work well in different cloud infrastructures and application scenarios, and can be used to create massively parallel applications for scientific analysis of Big Data.
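As a generic illustration of the MapReduce paradigm mentioned above (not the WS-PGRADE/gUSE integration itself), the canonical word-count job can be expressed as mapper and reducer steps; with Hadoop Streaming these would run as separate scripts reading stdin, but the sketch below keeps them as plain Python functions on sample data.

```python
from itertools import groupby
from operator import itemgetter

def mapper(lines):
    """Map step: emit (word, 1) for every word in the input split."""
    for line in lines:
        for word in line.strip().lower().split():
            yield word, 1

def reducer(pairs):
    """Reduce step: after the shuffle/sort phase, sum the counts for each key."""
    for word, group in groupby(sorted(pairs), key=itemgetter(0)):
        yield word, sum(count for _, count in group)

# Hypothetical input split; on Hadoop this would arrive via stdin from HDFS blocks.
sample = ["big data on the cloud", "big data processing with workflows"]
print(dict(reducer(mapper(sample))))
```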