892 results for multi-platform development
Abstract:
This paper describes an experiment exploring the effects of the TENCompetence infrastructure for supporting lifelong competence development, which is currently in development. This infrastructure provides structured, multi-level access to learning materials, based upon competences. People can follow their own learning path, supported by a listing of competences and their components, by competence development plans attached to competences, and by the possibility to mark elements as complete. We expected the PCM to have an effect on (1) participants' control of their own learning, and on their appreciation of (2) their learning route, (3) the learning resources, (4) their competence development, and (5) the possibilities for collaboration. In the experiment, 44 Bulgarian teachers followed a distance learning course on a specific teaching methodology for six weeks. Some of them used the TENCompetence infrastructure; the others used an infrastructure that was similar except for the characterizing elements mentioned above. The results showed that in the experimental condition, more people passed the final competence assessment, and people felt more in control of their own learning. No differences between the two groups were found in the amount and appreciation of collaboration or in further measures of competence development.
Abstract:
The objective of the PANACEA ICT-2007.2.2 EU project is to build a platform that automates the stages involved in the acquisition, production, updating and maintenance of the large language resources required by, among others, MT systems. The development of a Corpus Acquisition Component (CAC) for extracting monolingual and bilingual data from the web is one of the most innovative building blocks of PANACEA. The CAC, which is the first stage in the PANACEA pipeline for building Language Resources, adopts an efficient and distributed methodology to crawl for web documents with rich textual content in specific languages and predefined domains. The CAC includes modules that can acquire parallel data from sites with in-domain content available in more than one language. In order to extrinsically evaluate the CAC methodology, we conducted several experiments that used crawled parallel corpora for the identification and extraction of parallel sentences using sentence alignment. The corpora were then successfully used for domain adaptation of Machine Translation systems.
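The sentence-alignment step mentioned above can be sketched in miniature. This is not the PANACEA implementation; it is a minimal, length-based candidate filter in the spirit of classic length-ratio alignment, with invented example sentences and an assumed acceptable ratio window of [0.7, 1.4].

```python
# Minimal sketch of length-based parallel sentence matching (a crude stand-in
# for a full dynamic-programming aligner). All data and thresholds here are
# illustrative assumptions, not part of the PANACEA pipeline.

def length_ratio(src: str, tgt: str) -> float:
    """Character-length ratio between a source and a target sentence."""
    return len(tgt) / max(len(src), 1)

def match_parallel(src_sents, tgt_sents, lo=0.7, hi=1.4):
    """Greedily pair sentences in document order, keeping pairs whose
    length ratio falls inside [lo, hi]."""
    pairs = []
    for s, t in zip(src_sents, tgt_sents):
        if lo <= length_ratio(s, t) <= hi:
            pairs.append((s, t))
    return pairs

en = ["The cat sleeps.", "It rains a lot here."]
fr = ["Le chat dort.", "Il pleut beaucoup ici."]
print(match_parallel(en, fr))  # both pairs survive the ratio filter
```

A production aligner would instead score 1-1, 1-2 and 0-1 alignments jointly over whole documents; the ratio test above is only the intuition behind that scoring.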
Abstract:
Evaluating the possible benefits of the introduction of genetically modified (GM) crops must address the issue of consumer resistance as well as the complex regulation that has ensued. In the European Union (EU) this regulation envisions the "co-existence" of GM food with conventional and quality-enhanced products, mandates the labelling and traceability of GM products, and tolerates only a stringently limited adventitious presence of GM content in other products. All these elements are brought together within a partial equilibrium model of the EU agricultural food sector. The model comprises conventional, GM and organic food. Demand is modelled in a novel fashion, whereby organic and conventional products are treated as horizontally differentiated but GM products are vertically differentiated (weakly inferior) relative to conventional ones. Supply accounts explicitly for the land constraint at the sector level and for the need for additional resources to produce organic food. Model calibration and simulation allow insights into the qualitative and quantitative effects of the large-scale introduction of GM products in the EU market. We find that the introduction of GM food reduces overall EU welfare, mostly because of the associated need for costly segregation of non-GM products, but the producers of quality-enhanced products actually benefit.
Abstract:
Adjuvants are increasingly used by the vaccine research and development community, particularly for their ability to enhance immune responses and for their dose-sparing properties. However, they are not readily available to the majority of public sector vaccine research groups, and even those with access to suitable adjuvants may still fail in the development of their vaccines because of a lack of knowledge of how to correctly formulate the adjuvants. This shortcoming led the World Health Organization to advocate for the establishment of the Vaccine Formulation Laboratory at the University of Lausanne, Switzerland. The primary mission of the laboratory is to transfer adjuvants and formulation technology, free of intellectual property rights, to academic institutions, small biotechnology companies and developing-country vaccine manufacturers. In this context, the transfer of an oil-in-water emulsion to Bio Farma, an Indonesian vaccine manufacturer, was initiated to increase domestic pandemic influenza vaccine production capacity as part of the national pandemic influenza preparedness plan.
Abstract:
The old, understudied electoral system composed of multi-member districts, an open ballot and plurality rule is presented as the most remote origin of both political parties and new electoral systems. A survey of the uses of this set of electoral rules in different parts of the world, during both remote and recent periods, shows how widespread it has been. A model of voting under this electoral system demonstrates that, while it can produce varied and pluralistic representation, it also provides incentives to form factional or partisan candidacies. Famous negative reactions to the emergence of factions and political parties during the 18th and 19th centuries are reinterpreted in this context. Many electoral rules and procedures invented since the second half of the 19th century, including the Australian ballot, single-member districts, limited and cumulative ballots, and proportional representation rules, derived from the search for ways to reduce the tendency of the original multi-member district system to produce single-party sweeps. The general relations between political parties and electoral systems are restated to account for the foundational stage discussed here.
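The sweep mechanism the abstract describes is easy to demonstrate numerically. In this toy model (party names and vote counts are hypothetical, not from the paper), each voter marks as many candidates as there are seats; once voters coordinate on party slates, a bare-majority party takes every seat:

```python
# Toy model of multi-member-district plurality ("bloc") voting: each ballot
# marks `seats` candidates, and the `seats` candidates with the most marks win.
# Illustrates the single-party-sweep incentive discussed in the abstract.
from collections import Counter

def multi_member_plurality(ballots, seats):
    """ballots: list of candidate lists (one ballot marks `seats` names).
    Returns the `seats` candidates with the most marks."""
    tally = Counter(c for ballot in ballots for c in ballot)
    return [c for c, _ in tally.most_common(seats)]

# 3 seats; party A's 51 voters mark the full A slate, party B's 49 the B slate.
ballots = [["A1", "A2", "A3"]] * 51 + [["B1", "B2", "B3"]] * 49
print(multi_member_plurality(ballots, 3))  # A sweeps all three seats
```

With a 51-49 split, the winning slate takes 100% of the seats, which is exactly the disproportionality that limited ballots, cumulative ballots and proportional representation were later designed to soften.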
Abstract:
A high-resolution three-dimensional (3D) seismic reflection system for small-scale targets in lacustrine settings has been developed. Its main characteristics include navigation and shot-triggering software that fires the seismic source at regular distance intervals (max. error of 0.25 m) with real-time control on navigation using differential GPS (Global Positioning System). Receiver positions are accurately calculated (error < 0.20 m) with the aid of GPS antennas attached to the end of each of three 24-channel streamers. Two telescopic booms hold the streamers at a distance of 7.5 m from each other. With a receiver spacing of 2.5 m, the bin dimension is 1.25 m in the inline and 3.75 m in the crossline direction. To test the system, we conducted a 3D survey of about 1 km² in Lake Geneva, Switzerland, over a complex fault zone. A 5-m shot spacing resulted in a nominal fold of 6. A double-chamber bubble-cancelling 15/15 in³ air gun (40-650 Hz) operated at 80 bars and 1 m depth gave a signal penetration of 300 m below the water bottom and a best vertical resolution of 1.1 m. Processing followed a conventional scheme, but had to be adapted to the high sampling rates, and our unconventional navigation data needed conversion to industry standards. The high-quality data enabled us to construct maps of seismic horizons and fault surfaces in three dimensions. The system proves to be well adapted to investigating complex structures by providing non-aliased images of reflectors with dips up to 30 degrees.
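The geometry figures quoted in the abstract follow from standard marine-seismic rules of thumb (inline bin = half the receiver spacing, crossline bin = half the streamer separation, fold = channels × receiver spacing / (2 × shot spacing)). A quick check, using only the acquisition parameters given above:

```python
# Back-of-the-envelope verification of the acquisition geometry quoted in the
# abstract, using textbook CMP-binning rules of thumb.

def inline_bin(receiver_spacing_m: float) -> float:
    """Inline CMP bin size: half the receiver spacing."""
    return receiver_spacing_m / 2

def crossline_bin(streamer_sep_m: float) -> float:
    """Crossline CMP bin size: half the streamer separation."""
    return streamer_sep_m / 2

def nominal_fold(n_channels: int, receiver_spacing_m: float,
                 shot_spacing_m: float) -> float:
    """Nominal CMP fold for end-on 2D geometry per streamer."""
    return n_channels * receiver_spacing_m / (2 * shot_spacing_m)

print(inline_bin(2.5))           # 1.25 m, as quoted
print(crossline_bin(7.5))        # 3.75 m, as quoted
print(nominal_fold(24, 2.5, 5))  # 6.0, the quoted nominal fold
```

All three computed values reproduce the numbers stated in the abstract, which is a useful sanity check when planning a comparable survey.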
Abstract:
Drilled shafts have been used in the US for more than 100 years in bridges and buildings as a deep foundation alternative. For many of these applications, the drilled shafts were designed using the Working Stress Design (WSD) approach. Even though WSD has been used successfully in the past, a move toward Load Resistance Factor Design (LRFD) for foundation applications began when the Federal Highway Administration (FHWA) issued a policy memorandum on June 28, 2000. The policy memorandum requires all new bridges initiated after October 1, 2007, to be designed according to the LRFD approach. This ensures compatibility between the superstructure and substructure designs, and provides a means of consistently incorporating sources of uncertainty into each load and resistance component. Regionally calibrated LRFD resistance factors are permitted by the American Association of State Highway and Transportation Officials (AASHTO) to improve the economy and competitiveness of drilled shafts. To achieve this goal, a database for Drilled SHAft Foundation Testing (DSHAFT) has been developed. DSHAFT is aimed at assimilating high-quality drilled shaft test data from Iowa and the surrounding regions, and at identifying the need for further tests in suitable soil profiles. This report introduces DSHAFT and demonstrates its features and capabilities, such as an easy-to-use storage and sharing tool for providing access to key information (e.g., soil classification details and cross-hole sonic logging reports). DSHAFT embodies a model for effective, regional LRFD calibration procedures consistent with the PIle LOad Test (PILOT) database, which contains driven pile load tests accumulated from the state of Iowa. PILOT is now available for broader use at the project website: http://srg.cce.iastate.edu/lrfd/. 
DSHAFT, available in electronic form at http://srg.cce.iastate.edu/dshaft/, is currently comprised of 32 separate load tests provided by the Illinois, Iowa, Minnesota, Missouri and Nebraska state departments of transportation and/or roads. In addition to serving as a manual for DSHAFT and providing a summary of the available data, this report provides a preliminary analysis of the load test data from Iowa, and will open up opportunities for others to share their data through this quality-assured process, thereby providing a platform to improve the LRFD approach to drilled shafts, especially in the Midwest region.
Abstract:
Culverts are a common means to convey flow through the roadway system for small streams. In general, larger flows and higher road embankments entail the use of multi-barrel (a.k.a. multi-box) culverts. Box culverts are generally designed to handle events with a 50-year return period, and therefore convey considerably lower flows much of the time. While there are no issues with conveying high flows, many multi-box culverts in Iowa pose a significant problem related to sedimentation. The highly erosive Iowa soils can easily lead to a situation where some of the barrels silt in soon after construction, becoming partially filled with sediment within a few years. Silting can considerably reduce the capacity of the culvert to handle larger flow events. Phase I of this Iowa Highway Research Board project (TR-545) led to an innovative solution for preventing sedimentation. The solution was comprehensively investigated through laboratory experiments and numerical modeling aimed at screening design alternatives and testing their hydraulic and sediment conveyance performance. Following this study phase, the Technical Advisory Committee suggested implementing the recommended sediment mitigation design at a field site. The site selected for implementation was a 3-box culvert crossing Willow Creek on IA Hwy 1W in Iowa City. The culvert was constructed in 1981 and the first cleanup was needed in 2000. Phase II of TR-545 entailed monitoring the site with and without the self-cleaning sedimentation structure in place (similar to the study conducted in the laboratory). The first monitoring stage (September 2010 to December 2012) was aimed at providing a baseline for the operation of the as-designed culvert. In order to support the Phase II research, a cleanup of the IA Hwy 1W culvert was conducted in September 2011. Subsequently, a monitoring program was initiated to document the sedimentation produced by individual and multiple storms propagating through the culvert. 
The first two years of monitoring showed the inception of sedimentation in the first spring following the cleanup. Sedimentation continued to increase throughout the monitoring program, following the depositional patterns observed in the laboratory tests and those documented in the pre-cleaning surveys. The second part of Phase II of the study was aimed at monitoring the constructed self-cleaning structure. Since its construction in December 2012, the culvert site has been continuously monitored through systematic observations. The evidence gathered in this phase of the study demonstrates the good performance of the self-cleaning structure in mitigating sediment deposition at culverts. Besides their beneficial role in sediment mitigation, the designed self-cleaning structures maintain a clean and clear area upstream of the culvert and keep a healthy flow through the central barrel, offering hydraulic conditions and aquatic habitat similar to those in the undisturbed stream reaches upstream and downstream of the culvert. It can be concluded that the proposed self-cleaning structural solution "streamlines" the area upstream of the culvert in a way that secures the safety of the culvert structure at high flows while disturbing the stream behavior much less than current construction approaches.
Abstract:
The Iowa Department of Transportation (IDOT) has been requiring Critical Path Method (CPM) schedules on some larger or more schedule-sensitive projects. The Office of Construction's expectations for enhanced project control and improved communication of project objectives have not been fully met by the use of CPM. Recognizing that the current procedures might not be adequate for all projects, IDOT sponsored a research project to explore the state of the art in transportation scheduling and identify opportunities for improvement. The first phase of this project identified a technique known as the Linear Scheduling Method (LSM) as an alternative to CPM on certain highway construction projects. LSM graphically displays the construction process with respect to the location and the time at which each activity occurs. The current phase of this project was implemented to allow the research team the opportunity to evaluate LSM on a small group of diverse projects. Unlike the first phase of the project, the research team was closely involved in the projects from early in the planning phase through their completion. The research strongly suggests that the linear scheduling technique has great potential as a project management tool for both contractors and IDOT personnel. However, before this technique can become a viable weapon in the project management arsenal, a software application needs to be developed. This application should bring to linear scheduling a degree of functionality as rich and as comprehensive as that found in microcomputer-based CPM software on the market today. The research team recommends that IDOT extend this research effort to include the development of a linear scheduling application.
Abstract:
Introduction: The field of connectomic research is growing rapidly, as a result of methodological advances in structural neuroimaging on many spatial scales. In particular, progress in diffusion MRI data acquisition and processing has made macroscopic structural connectivity maps available in vivo through Connectome Mapping Pipelines (Hagmann et al, 2008), producing so-called Connectomes (Hagmann 2005, Sporns et al, 2005). These exhibit both spatial and topological information that constrain functional imaging studies and are relevant to their interpretation. The need has grown for a special-purpose software tool that supports both clinical researchers and neuroscientists in investigating such connectome data. Methods: We developed the ConnectomeViewer, a powerful, extensible software tool for visualization and analysis in connectomic research. It uses the newly defined, container-like Connectome File Format, specifying networks (GraphML), surfaces (Gifti), volumes (Nifti), track data (TrackVis) and metadata. The use of Python as the programming language allows it to be cross-platform and gives it access to a multitude of scientific libraries. Results: Using a flexible plugin architecture, it is easy to enhance functionality for specific purposes. The following features are already implemented: * Ready use of libraries, e.g. for complex network analysis (NetworkX) and data plotting (Matplotlib); more brain connectivity measures will be implemented in a future release (Rubinov et al, 2009). * 3D view of networks with node positioning based on the corresponding ROI surface patch; other layouts are possible. * Picking functionality to select nodes and edges, get more node information (ConnectomeWiki), and toggle surface representations. * Interactive thresholding and modality selection of edge properties using filters. * Arbitrary metadata can be stored for networks, thereby allowing e.g. group-based analysis or meta-analysis. * A Python shell for scripting. 
Application data is exposed and can be modified or used for further post-processing. * Visualization pipelines using filters and modules can be composed with Mayavi (Ramachandran et al, 2008). * An interface to TrackVis to visualize track data; selected nodes are converted to ROIs for fiber filtering. The Connectome Mapping Pipeline (Hagmann et al, 2008) was used to process 20 healthy subjects into an average Connectome dataset. The figures show the ConnectomeViewer user interface using this dataset; connections are shown that occur in all 20 subjects. The dataset is freely available from the homepage (connectomeviewer.org). Conclusions: The ConnectomeViewer is a cross-platform, open-source software tool that provides extensive visualization and analysis capabilities for connectomic research. It has a modular architecture, integrates the relevant datatypes and is completely scriptable. Visit www.connectomics.org to get involved as a user or developer.
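The network-analysis and edge-thresholding features described above can be sketched with NetworkX, the library the abstract names. This is not ConnectomeViewer code: the region names, weights, and the `threshold` helper are invented for illustration, but the pattern (a weighted graph plus an edge-weight filter) mirrors the interactive thresholding the tool provides.

```python
# Sketch of NetworkX-based connectome analysis: build a toy weighted graph,
# apply an edge-weight threshold (cf. the interactive filtering described
# above), and compute node degree. All names and weights are made up.
import networkx as nx

G = nx.Graph()
G.add_edge("lh.precuneus", "rh.precuneus", weight=0.9)
G.add_edge("lh.precuneus", "lh.cuneus", weight=0.2)
G.add_edge("rh.precuneus", "rh.cuneus", weight=0.6)

def threshold(graph, min_weight):
    """Return a copy keeping only edges with weight >= min_weight."""
    H = nx.Graph()
    H.add_nodes_from(graph.nodes)
    H.add_edges_from((u, v, d) for u, v, d in graph.edges(data=True)
                     if d["weight"] >= min_weight)
    return H

H = threshold(G, 0.5)
print(sorted(H.edges()))  # only the two strong connections survive
print(dict(H.degree()))   # node degrees after thresholding
```

In the real tool the graph would be read from the Connectome File Format's GraphML payload (e.g. via `nx.read_graphml`) rather than built by hand.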
Abstract:
BACKGROUND: Whether nucleoside reverse transcriptase inhibitors increase the risk of myocardial infarction in HIV-infected individuals is unclear. Our aim was to explore whether exposure to such drugs was associated with an excess risk of myocardial infarction in a large, prospective observational cohort of HIV-infected patients. METHODS: We used Poisson regression models to quantify the relation between cumulative, recent (currently or within the preceding 6 months), and past use of zidovudine, didanosine, stavudine, lamivudine, and abacavir and the development of myocardial infarction in 33,347 patients enrolled in the D:A:D study. We adjusted for cardiovascular risk factors that are unlikely to be affected by antiretroviral therapy, cohort, calendar year, and use of other antiretrovirals. FINDINGS: Over 157,912 person-years, 517 patients had a myocardial infarction. We found no associations between the rate of myocardial infarction and cumulative or recent use of zidovudine, stavudine, or lamivudine. By contrast, recent (but not cumulative) use of abacavir or didanosine was associated with an increased rate of myocardial infarction (compared with those with no recent use of the drugs, relative rate 1.90, 95% CI 1.47-2.45 [p=0.0001] with abacavir and 1.49, 1.14-1.95 [p=0.003] with didanosine); rates were not significantly increased in those who stopped these drugs more than 6 months previously compared with those who had never received these drugs. After adjustment for predicted 10-year risk of coronary heart disease, recent use of both didanosine and abacavir remained associated with increased rates of myocardial infarction (1.49, 1.14-1.95 [p=0.004] with didanosine; 1.89, 1.47-2.45 [p=0.0001] with abacavir). INTERPRETATION: There is an increased risk of myocardial infarction in patients exposed to abacavir and didanosine within the preceding 6 months. 
The excess risk does not seem to be explained by underlying established cardiovascular risk factors and was not present beyond 6 months after drug cessation.
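The basic arithmetic behind a cohort analysis like this one is the incidence rate (events per person-time) and the ratio of such rates between exposure groups. The sketch below uses only the cohort totals quoted in the abstract (517 myocardial infarctions over 157,912 person-years); the two-group comparison at the end uses invented numbers, since per-group person-years are not given, and it does not reproduce the adjusted Poisson-regression estimates reported above.

```python
# Crude incidence-rate arithmetic for a person-time cohort. Only the totals
# (517 events, 157,912 person-years) come from the abstract; the exposed /
# unexposed split below is hypothetical.

def rate_per_1000py(events: int, person_years: float) -> float:
    """Incidence rate expressed per 1000 person-years of follow-up."""
    return 1000 * events / person_years

overall = rate_per_1000py(517, 157_912)
print(round(overall, 2))  # ~3.27 MIs per 1000 person-years overall

# Crude rate ratio between a hypothetical exposed and unexposed group:
exposed = rate_per_1000py(60, 10_000)
unexposed = rate_per_1000py(30, 10_000)
print(exposed / unexposed)  # 2.0
```

The study's relative rates differ from such crude ratios because the Poisson models adjust for cardiovascular risk factors, cohort, calendar year and co-medication.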
Abstract:
This paper describes the results of research on diverse areas of information technology applied to cartography. The final result is a complete, custom geographic information web system, designed and implemented to manage archaeological information for the city of Tarragona. The goal of the platform is to display geographical and alphanumerical data in a web-focused application and to provide specific queries for exploring them. Various tools, among others, have been used: the PostgreSQL database management system in conjunction with its geographic extension PostGIS, the GeoServer geographic server, the GeoWebCache tile cache, the map viewer plus map and satellite imagery from Google Maps, location imagery from Google Street View, and other open source libraries. The technology was chosen based on an investigation of the project requirements, and this investigation took up a large part of the development. Except for the Google Maps tools, which are not open source but are free, the entire design has been implemented with open source and free tools.
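The PostGIS side of such a platform typically answers map-viewport requests with a spatial predicate query. As a hedged sketch (the table and column names `archaeological_finds`, `geom`, `period` are hypothetical, and the SRID default is only an assumption; `ST_Intersects` and `ST_MakeEnvelope` are standard PostGIS functions), the query might be composed like this:

```python
# Sketch of the kind of spatial SQL such a platform would send to
# PostgreSQL/PostGIS. Table/column names and the SRID are assumptions;
# the query is only composed here, not executed against a database.

def finds_in_bbox_sql(srid: int = 4326) -> str:
    """SQL selecting finds whose geometry intersects a bounding box,
    with %s placeholders for driver-side parameter binding (e.g. psycopg2)."""
    return (
        "SELECT id, name, period "
        "FROM archaeological_finds "
        f"WHERE ST_Intersects(geom, ST_MakeEnvelope(%s, %s, %s, %s, {srid}))"
    )

print(finds_in_bbox_sql())
```

In deployment the viewport coordinates would be passed as bound parameters (never interpolated into the string), and GeoServer/GeoWebCache would sit in front of queries like this to serve the rendered tiles.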
Abstract:
This final degree project is based on the creation of an online, distributed, multi-device job board. It was built with new technologies such as Play Framework and Twitter Bootstrap, using the Java and Scala languages and HTML5 markup, and was deployed on a cloud computing platform called Heroku.
Abstract:
The main function of a roadway culvert is to effectively convey drainage flow during normal and extreme hydrologic conditions. This function is often impaired by sedimentation blockage of the culvert. This research sought to understand the mechanics of the sedimentation process at multi-box culverts and to develop self-cleaning systems that flush out sediment deposits using the power of drainage flows. The research entailed field observations, laboratory experiments, and numerical simulations. The specific role of each of these investigative tools is summarized below: a) The field observations were aimed at understanding typical sedimentation patterns and their dependence on culvert geometry and hydrodynamic conditions during normal and extreme hydrologic events. b) The laboratory experiments were used for modeling the sedimentation process observed in situ and for testing alternative self-cleaning concepts applied to culverts. The major tasks for the initial laboratory model study were to accurately replicate the culvert performance curves and the dynamics of the sedimentation process, and to provide benchmark data for validating the numerical simulations. c) The numerical simulations enhanced the understanding of the sedimentation processes and aided in testing flow cases complementary to those conducted in the model, reducing the number of (more expensive) tests to be conducted in the laboratory. Using the findings acquired from the laboratory and simulation work, self-cleaning culvert concepts were developed and tested for a range of flow conditions. The screening of the alternative concepts was carried out through experimental studies in a 1:20 scale model guided by numerical simulations. To ensure the designs are effective, performance studies were finally conducted in a 1:20 hydraulic model using the most promising design alternatives, to make sure that the proposed systems operate satisfactorily under conditions closer to natural scale.
Abstract:
The present study is an integral part of a broader study focused on the design and implementation of self-cleaning culverts, i.e., configurations that prevent the formation of sediment deposits after culvert construction or cleaning. Sediment deposition at culverts is influenced by many factors, including the size and characteristics of material of which the channel is composed, the hydraulic characteristics generated under different hydrology events, the culvert geometry design, channel transition design, and the vegetation around the channel. The multitude of combinations produced by this set of variables makes the investigation of practical situations a complex undertaking. In addition to the considerations above, the field and analytical observations have revealed flow complexities affecting the flow and sediment transport through culverts that further increase the dimensions of the investigation. The flow complexities investigated in this study entail: flow non-uniformity in the areas of transition to and from the culvert, flow unsteadiness due to the flood wave propagation through the channel, and the asynchronous correlation between the flow and sediment hydrographs resulting from storm events. To date, the literature contains no systematic studies on sediment transport through multi-box culverts or investigations on the adverse effects of sediment deposition at culverts. Moreover, there is limited knowledge about the non-uniform, unsteady sediment transport in channels of variable geometry. Furthermore, there are few readily useable (inexpensive and practical) numerical models that can reliably simulate flow and sediment transport in such complex situations. Given the current state of knowledge, the main goal of the present study is to investigate the above flow complexities in order to provide the needed insights for a series of ongoing culvert studies. 
The research was phased so that field observations were conducted first to understand culvert behavior in the Iowa landscape. Modeling through complementary hydraulic-model and numerical experiments was subsequently carried out to gain the practical knowledge needed for the development of the self-cleaning culvert designs.