753 results for packages
Abstract:
DNA extraction was carried out as described on the MICROBIS project pages (http://icomm.mbl.edu/microbis) using a commercially available extraction kit. We amplified the hypervariable regions V4-V6 of archaeal and bacterial 16S rRNA genes using PCR and several sets of forward and reverse primers (http://vamps.mbl.edu/resources/primers.php). Massively parallel tag sequencing of the PCR products was carried out on a 454 Life Sciences GS FLX sequencer at the Marine Biological Laboratory, Woods Hole, MA, following the same experimental conditions for all samples. Sequence reads were subjected to a rigorous quality control procedure based on mothur v30 (doi:10.1128/AEM.01541-09), including denoising of the flowgrams using an algorithm based on PyroNoise (doi:10.1038/nmeth.1361), removal of PCR errors, and a chimera check using UCHIME (doi:10.1093/bioinformatics/btr381). The reads were taxonomically assigned according to the SILVA taxonomy (SSURef v119, 07-2014; doi:10.1093/nar/gks1219) implemented in mothur and clustered at 98% ribosomal RNA gene V4-V6 sequence identity. V4-V6 amplicon sequence abundance tables were standardized to account for unequal sampling effort by drawing 1000 (Archaea) and 2300 (Bacteria) sequences at random without replacement using mothur, and then used to calculate inverse Simpson diversity indices and Chao1 richness (doi:10.2307/4615964). Bray-Curtis dissimilarities (doi:10.2307/1942268) between all samples were calculated and used for two-dimensional non-metric multidimensional scaling (NMDS) ordinations with 20 random starts (doi:10.1007/BF02289694). Stress values below 0.2 indicated that the multidimensional dataset was well represented by the 2D ordination. NMDS ordinations were compared and tested using Procrustes correlation analysis (doi:10.1007/BF02291478).
All analyses were carried out with the R statistical environment and the packages vegan (available at: http://cran.r-project.org/package=vegan) and labdsv (available at: http://cran.r-project.org/package=labdsv), as well as with custom R scripts. Operational taxonomic units at 98% sequence identity (OTU0.03) that occurred only once in the whole dataset were termed absolute single sequence OTUs (SSOabs; doi:10.1038/ismej.2011.132). OTU0.03 sequences that occurred only once in at least one sample, but possibly more often in other samples, were termed relative single sequence OTUs (SSOrel). SSOrel are particularly interesting for community ecology, since they comprise rare organisms that might become abundant when conditions change. 16S rRNA amplicons and metagenomic reads have been stored in the sequence read archive under SRA project accession number SRP042162.
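The diversity and dissimilarity measures named above can be sketched as follows. This is an illustrative Python/NumPy example on a toy abundance table, not the authors' code (they used mothur and R's vegan); the sample vectors are hypothetical.

```python
# Illustrative sketch of the metrics described above: inverse Simpson,
# Chao1 richness, and Bray-Curtis dissimilarity on toy OTU count vectors.
import numpy as np

def inverse_simpson(counts):
    """Inverse Simpson index 1 / sum(p_i^2) for one sample."""
    p = counts / counts.sum()
    return 1.0 / np.sum(p ** 2)

def chao1(counts):
    """Chao1: S_obs + F1^2 / (2*F2), F1 = singletons, F2 = doubletons."""
    s_obs = np.count_nonzero(counts)
    f1 = np.sum(counts == 1)
    f2 = np.sum(counts == 2)
    if f2 == 0:
        return s_obs + f1 * (f1 - 1) / 2.0  # bias-corrected form
    return s_obs + f1 ** 2 / (2.0 * f2)

def bray_curtis(x, y):
    """Bray-Curtis dissimilarity between two samples."""
    return np.abs(x - y).sum() / (x + y).sum()

sample_a = np.array([10, 5, 1, 1, 2, 0])  # hypothetical OTU counts
sample_b = np.array([0, 8, 3, 1, 0, 7])
print(inverse_simpson(sample_a))
print(chao1(sample_a))
print(bray_curtis(sample_a, sample_b))
```

In the paper these values are computed on tables already rarefied to equal depth; rarefaction and the NMDS ordination itself are omitted here for brevity.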
Abstract:
A new package called adolist is presented. adolist is a tool to create, install, and uninstall lists of user ado-packages (“adolists”). For example, adolist can create a list of all user packages installed on a system and then install the same packages on another system. Moreover, adolist can be used to put together thematic lists of packages such as, say, a list on income inequality analysis or time-series add-ons, or the list of “41 user ados everyone should know”. Such lists can then be shared with others, who can easily install and uninstall the listed packages using the adolist command.
Abstract:
A new command called adolist is presented. adolist is a tool to create, install, and uninstall lists of user ado-packages (“adolists”). For example, adolist can create a list of all user packages installed on a system and then install the same packages on another system. Moreover, adolist can be used to put together thematic lists of packages such as, say, a list on income inequality analysis or time-series add-ons, or the list of “41 user ados everyone should know”. Such lists can then be shared with others, who can easily install and uninstall the listed packages using the adolist command.
Abstract:
Introduction: During the period from the latter half of the 1980s until just before the Asian currency crisis in 1997, Indonesia’s economic development drew expectations and attention from various quarters, along with Malaysia and Thailand within the same Association of Southeast Asian Nations (ASEAN). In fact, the 1993 World Bank report entitled “The East Asian Miracle: Economic Growth and Public Policy” recognized Indonesia as one of the East Asian economies with strong economic performance, i.e. sustained economic growth (World Bank [1993]). And it was the manufacturing industry that had been the driving force behind Indonesia’s economic growth during that period. Since the 1997 outbreak of the Asian currency crisis, however, the manufacturing sector in Indonesia has been mired in a situation that rules out the kind of bright prospects it had previously enjoyed. The Indonesian economy is still in the developing stage, and in accordance with the history of industrial structural change in other countries, Indonesia’s manufacturing industry can still be expected to serve as the engine of the country’s economic development. But is that really possible in an environment where economic liberalization and globalization are forging ahead? And what sort of problems have to be dealt with to make it possible? To answer these questions, it is necessary to know the current conditions of Indonesia’s manufacturing sector, and to do that, it is important to look back on the history of the country’s industrialization. Thus, this paper is intended to retrace the course of Indonesia’s industrialization up until the establishment of the manufacturing sector in its present form, with the ultimate goal of answering the above-mentioned questions. The analysis covers the period from the installation of President Soeharto’s administration onward, when industrialization of the modern industrial sector moved into high gear.
The composition of this paper is outlined below. Section 1 first shows why it is important to examine import substitution and export orientation, the two measures used in the analysis in this paper, when tracking the history of industrialization, and then discusses indicators of import substitution and export orientation as well as the statistical data and resources needed to develop those indicators. Section 2 clarifies the status of the manufacturing industry among all industries by looking at the manufacturing industry’s share of value added, imports and exports. Sections 3 to 5 cover three periods between 1971 and 1995 and analyze import substitution, export orientation and changes in the industrial structure in each period. Section 3 analyzes the period from 1971 through 1985, when Indonesia pursued an import substitution policy amid the oil boom. Section 4 covers the period from 1985 through 1990, when packages of deregulatory measures were announced successively under the structural adjustment policies made necessary by the fall in oil prices. Section 5 examines the period from 1990 through 1995, which saw alternating shifts between an economy overheated by sharply rising investment from both domestic and foreign investors in the wake of the liberalization of investment, trade and financial services, and policies to cool the economy down. Section 6, which covers the 1995-1999 period straddling the economic crisis, analyzes changes in production trends before and after the crisis as well as changes in the industrial structure. Section 7, after summing up the history of Indonesia’s industrialization examined in the previous sections, discusses the problems found in the respective sectors and attempts to present future prospects for the country’s manufacturing industry.
Abstract:
The solaR package includes a set of functions to calculate the solar radiation incident on a photovoltaic generator and to simulate the performance of several applications of photovoltaic energy. This package performs the whole calculation procedure from both daily and intradaily global horizontal irradiation to the final productivity of grid-connected PV systems and water pumping PV systems. The package stands on a set of S4 classes. The core of each class is a group of slots with yearly, monthly, daily and intradaily multivariate time series (built with the zoo package). The classes share a variety of methods to access the information (for example, as.zooD provides a zoo object with the daily multivariate time series of the corresponding object) and several visualisation methods based on the lattice and latticeExtra packages.
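The calculation chain solaR implements (horizontal irradiation to final productivity) can be illustrated with a deliberately simplified sketch. This is Python, not solaR's R code, and the scaling factors (transposition factor, performance ratio) are hypothetical placeholders, not the package's actual irradiance and system models.

```python
# Simplified illustration of the kind of chain solaR performs:
# daily global horizontal irradiation -> generator-plane irradiation -> AC energy.
# The transposition factor and performance ratio below are made-up placeholders.
def daily_ac_energy(g0_kwh_m2, transposition=1.15, pdc_kw=100.0,
                    performance_ratio=0.8, gstc_kw_m2=1.0):
    """Rough daily AC energy (kWh) of a grid-connected PV system."""
    g_eff = g0_kwh_m2 * transposition          # irradiation on the generator plane
    return pdc_kw * (g_eff / gstc_kw_m2) * performance_ratio

# A 100 kW system on a day with 5 kWh/m2 of horizontal irradiation:
print(daily_ac_energy(5.0))  # roughly 460 kWh under these assumed factors
```

solaR replaces each placeholder with a proper model (decomposition and transposition of irradiation, cell temperature, inverter efficiency), applied to the multivariate time series held in its S4 classes.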
Abstract:
The problem is general: modern architects and engineers are trying to understand historic structures using the wrong theoretical frame, the classic (elastic) theory of structures developed in the 19th century for iron and steel, and in the 20th century for reinforced concrete, disguised with "modern" computer packages, mainly FEM, but also others. Masonry is an essentially different material, and the structural equations must be adapted accordingly. It is not a matter of "taste" or "opinion", and the consequences are before us. Since, say, the 1920s, historic monuments have suffered the aggression of generations of architects and engineers trying to transform masonry into reinforced concrete or steel. The damage to the monuments, and the expense, has been, and is, enormous. However, as we have an adequate theory (modern limit analysis of masonry structures, Heyman 1966) which encompasses the "old theory" used successfully by the 18th- and 19th-century practical engineers (from Perronet to Séjourné), it is a matter of "ethics" not to use the wrong approach. It is also "contra natura" to modify the material masonry with indiscriminate injections, stitchings, etc. It is insane to consider that buildings which are centuries or millennia old are suddenly in danger of collapse. Maintenance is necessary, but not the actual destruction of the constructive essence of the monument. A cocktail of "ignorance, fear and greed" is acting under the best of intentions.
Abstract:
A number of data description languages initially designed as standards for the WWW are currently being used to implement user interfaces to programs. This is done independently of whether such programs are executed on the same host as the one running the user interface or on a different one. The advantage of this approach is that it provides a portable, standardized, and easy-to-use solution for the application programmer, and a familiar behavior for the user, typically well versed in the use of WWW browsers. Among the proposed standard description languages, VRML is aimed at representing three-dimensional scenes including hyperlink capabilities. VRML is already used as an import/export format in many 3-D packages and tools, and has been shown effective in displaying complex objects and scenarios. We propose and describe a Prolog library which allows parsing and checking VRML code, transforming it, and writing it out as VRML again. The library converts such code to an internal representation based on first-order terms which can then be arbitrarily manipulated. We also present as an example application the use of this library to implement a novel 3-D visualization for examining and understanding certain aspects of the behavior of CLP(FD) programs.
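The core idea of parsing VRML into manipulable terms and writing it back out can be sketched as follows. This is Python rather than the library's Prolog, handles only a toy flat `Node { field value ... }` fragment, and a (functor, fields) tuple stands in for a first-order term; none of this is the library's actual representation.

```python
# Sketch: a toy, flat VRML-like node parsed into a term-like tuple and unparsed.
# Real VRML (nested nodes, typed fields) is far richer than this fragment.
import re

def parse_node(tokens):
    """Read 'Name { f1 v1 f2 v2 ... }' into ('Name', [(f1, v1), ...])."""
    name = next(tokens)
    assert next(tokens) == "{"
    fields = []
    for tok in tokens:
        if tok == "}":
            return (name, fields)
        fields.append((tok, next(tokens)))   # field name, field value
    raise ValueError("unterminated node")

def unparse(term):
    """Write the term back out as VRML-like text."""
    name, fields = term
    body = " ".join(f"{f} {v}" for f, v in fields)
    return f"{name} {{ {body} }}"

src = "Sphere { radius 2.5 }"
term = parse_node(iter(re.findall(r"[{}]|[^\s{}]+", src)))
print(term)           # ('Sphere', [('radius', '2.5')])
print(unparse(term))  # Sphere { radius 2.5 }
```

Between `parse_node` and `unparse`, the term can be transformed arbitrarily, which is the round-trip workflow the abstract describes.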
Abstract:
We present and evaluate a compiler from Prolog (and extensions) to JavaScript which makes it possible to use (constraint) logic programming to develop the client side of web applications while being compliant with current industry standards. Targeting JavaScript makes (C)LP programs executable in virtually every modern computing device with no additional software requirements from the point of view of the user. In turn, the use of a very high-level language facilitates the development of high-quality, complex software. The compiler is a back end of the Ciao system and supports most of its features, including its module system and its rich language extension mechanism based on packages. We present an overview of the compilation process and a detailed description of the run-time system, including the support for modular compilation into separate JavaScript code. We demonstrate the maturity of the compiler by testing it with complex code such as a CLP(FD) library written in Prolog with attributed variables. Finally, we validate our proposal by measuring the performance of some LP and CLP(FD) benchmarks running on top of major JavaScript engines.
Abstract:
The Boundary Element Method is a powerful numerical technique well rooted in everyday engineering practice. This is shown by the inclusion of boundary element methods in the most important commercial computer packages and by the continuous publication of books written to explain the features of the method to beginners or practicing engineers. Our first paper on Boundary Elements in Computers & Structures was published in 1979 (C & S 10, pp. 351–362), so this Special Issue is for us not only the fulfilment of our obligation to show other colleagues the possibilities of a numerical technique in which we believe, but also the celebration of our particular silver jubilee with this Journal.
Abstract:
Web-based education or “e-learning” has become a critical component of higher education over the last decade, replacing other distance learning methods such as traditional computer-based training or correspondence learning. The number of university students who take on-line courses is continuously increasing all over the world. In Spain, nearly 90% of universities have an institutional e-learning platform, and over 60% of traditional on-site courses use this technology as a supplement to face-to-face classes. This new form of learning removes geographical barriers and enables students to schedule their own learning process, among other advantages. On-line education is delivered through specific software called an “e-learning platform” or “virtual learning environment” (VLE). A considerable number of web-based tools to deliver distance courses are currently available. Open source software packages such as Moodle, Sakai, dotLRN or Dokeos are the most commonly used in the virtual campuses of Spanish universities. This paper analyzes the possibilities that virtual learning environments offer university teachers and learners and presents a technical comparison of some of the most popular e-learning platforms.
Abstract:
Providing QoS in the context of ad hoc networks covers a very wide field of application from the perspective of every level of the network architecture. Put another way, it is possible to speak of QoS when a network is capable of guaranteeing reliable end-to-end communication between any pair of network nodes, by means of efficient management and administration of resources that allows a suitable differentiation of services according to the characteristics and demands of each application. The principal objective of this article is the analysis of the quality-of-service parameters that reactive routing protocols such as AODV and DSR provide in mobile ad hoc networks; all of this is supported by the ns-2 simulator. We analyze the behavior of parameters such as effective channel throughput, packet loss and latency in these routing protocols, and show which protocol presents better Quality of Service (QoS) characteristics in MANET networks.
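The QoS metrics the article evaluates can be sketched from per-packet records. This is an illustrative Python example over a hypothetical list of (send_time, recv_time, bits) tuples, not the actual ns-2 trace format or the authors' analysis scripts.

```python
# Illustrative sketch: packet loss ratio, average latency, and throughput
# from hypothetical per-packet records; recv_time is None for lost packets.
def qos_metrics(records):
    delivered = [(s, r, b) for s, r, b in records if r is not None]
    loss_ratio = 1.0 - len(delivered) / len(records)
    avg_latency = sum(r - s for s, r, _ in delivered) / len(delivered)
    duration = max(r for _, r, _ in delivered) - min(s for s, _, _ in records)
    throughput_bps = sum(b for *_, b in delivered) / duration
    return loss_ratio, avg_latency, throughput_bps

# Four 1000-byte packets, one lost in transit:
records = [(0.0, 0.02, 8000), (0.1, 0.13, 8000), (0.2, None, 8000), (0.3, 0.31, 8000)]
loss, latency, throughput = qos_metrics(records)
print(loss)  # 0.25
```

Comparing AODV and DSR then amounts to computing these figures from each protocol's simulation trace under identical traffic and mobility scenarios.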
Abstract:
In SSL general illumination there is a clear trend toward high-flux packages with higher efficiency and higher CRI, addressed with the use of multiple color chips and phosphors. However, such light sources require that the optics provide color mixing, both in the near field and the far field. This design problem is especially challenging for collimated luminaires, in which diffusers (which dramatically reduce the brightness) cannot be applied without enlarging the exit aperture too much. In this work we present the first injection-molded prototypes of a novel primary shell-shaped optic that has microlenses on both sides to provide Köhler integration. This shell is designed so that, when it is placed on top of an inhomogeneous multichip Lambertian LED, it creates a highly homogeneous virtual source (i.e., spatially and angularly mixed), also Lambertian, which is located in the same position with only a small increase in size (about 10-20%, so the average brightness is similar to that of the source). This shell-mixer device is very versatile and now permits the use of a lens or a reflector as secondary optics to collimate the light as desired, without color separation effects. Experimental measurements have shown an optical efficiency of the shell of 95% and highly homogeneous angular intensity distributions of the collimated beams, in good agreement with the ray-tracing simulations.
Abstract:
This research is concerned with the experimental software engineering area, specifically experiment replication. Replication has traditionally been viewed as a complex task in software engineering. This is possibly due to the present immaturity of the experimental paradigm as applied to software development. Researchers usually use replication packages to replicate an experiment. However, replication packages are not the solution to all the information management problems that crop up as successive replications of an experiment accumulate. This research borrows ideas from the software configuration management and software product line paradigms to support the replication process. We believe that configuration management can help to manage and administer information from one replication to another: hypotheses, designs, data analysis, etc. The software product line paradigm can help to organize and manage any changes introduced into the experiment by each replication. We expect the union of the two paradigms in replication to improve the planning, design and execution of further replications and their alignment with existing replications. Additionally, this research will contribute a web support environment for archiving information related to different experiment replications. It will also provide information management support flexible enough to run replications with different numbers and types of changes. Finally, it will afford massive storage of data from different replications. Experimenters working collaboratively on the same experiment must all have access to the different replications.
Abstract:
There is no empirical evidence whatsoever to support most of the beliefs on which software construction is based. We do not yet know the adequacy, limits, qualities, costs and risks of the technologies used to develop software. Experimentation helps to check and convert beliefs and opinions into facts. This research is concerned with the replication area. Replication is a key component for gathering empirical evidence on software development that can be used in industry to build better software more efficiently. Replication has not been an easy thing to do in software engineering (SE) because the experimental paradigm applied to software development is still immature. Nowadays, a replication is executed mostly using a traditional replication package. But traditional replication packages do not appear, for some reason, to have been as effective as expected for transferring information among researchers in SE experimentation. The trouble spot appears to be the replication setup, caused by version management problems with materials, instruments, documents, etc. This has proved to be an obstacle to obtaining enough details about the experiment to be able to reproduce it as exactly as possible. We address the problem of information exchange among experimenters by developing a schema to characterize replications. We will adapt configuration management and product line ideas to support the experimentation process. This will enable researchers to make systematic decisions based on explicit knowledge rather than assumptions about replications. This research will output a replication support web environment. This environment will not only archive but also manage experimental materials flexibly enough to allow both similar and differentiated replications with massive experimental data storage. The platform should be accessible to several research groups working together on the same families of experiments.
Abstract:
The solaR package allows for reproducible research on both photovoltaic (PV) system performance and solar radiation. It includes a set of classes, methods and functions to calculate the sun geometry and the solar radiation incident on a photovoltaic generator and to simulate the performance of several applications of photovoltaic energy. This package performs the whole calculation procedure from both daily and intradaily global horizontal irradiation to the final productivity of grid-connected PV systems and water pumping PV systems. It is designed using a set of S4 classes whose core is a group of slots with multivariate time series. The classes share a variety of methods to access the information and several visualization methods. In addition, the package provides a tool for the visual statistical analysis of the performance of a large PV plant composed of several systems. Although solaR is primarily designed for time series associated with a location defined by its latitude/longitude values and its temperature and irradiation conditions, it can be easily combined with spatial packages for space-time analysis.