830 results for blended workflow


Relevance:

10.00%

Publisher:

Abstract:

The use of electronic resources such as Adobe® Connect™ 8 in the type of learning known as blended learning is of notable practical importance, and not only for teaching degree-course subjects: it also matters for subjects being phased out, for which no classes are taught to students who have not yet passed them. This is all the more so given that B-Learning, or blended learning, means that the teacher not only continues in the traditional role of instructor, but also draws on the teaching materials that computing and, in particular, the Internet provide, thereby becoming an online tutor as well as a face-to-face instructor.

Relevance:

10.00%

Publisher:

Abstract:

Compute grids are used widely in many areas of environmental science, but there has been limited uptake of grid computing by the climate modelling community, partly because the characteristics of many climate models make them difficult to use with popular grid middleware systems. In particular, climate models usually produce large volumes of output data, and running them also involves complicated workflows implemented as shell scripts. A new grid middleware system that is well suited to climate modelling applications is presented in this paper. Grid Remote Execution (G-Rex) allows climate models to be deployed as Web services on remote computer systems and then launched and controlled as if they were running on the user's own computer. Output from the model is transferred back to the user while the run is in progress to prevent it from accumulating on the remote system and to allow the user to monitor the model. G-Rex has a REST architectural style, featuring a Java client program that can easily be incorporated into existing scientific workflow scripts. Some technical details of G-Rex are presented, with examples of its use by climate modellers.
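Because G-Rex exposes model runs through a REST interface, even a plain HTTP client can drive a run. The Python sketch below illustrates that interaction pattern; the service URL, endpoint paths, and JSON fields are assumptions for illustration, not the actual G-Rex API (the real client described in the paper is a Java program).

```python
# Hypothetical sketch of driving a G-Rex-style REST job service.
# Endpoint paths and JSON fields are assumptions for illustration.
import time
import requests

BASE = "http://cluster.example.org/grex/services/nemo"  # hypothetical service URL

def run_and_stream(input_files):
    # Create a new job instance (hypothetical endpoint).
    job = requests.post(f"{BASE}/instances", json={"inputs": input_files}).json()
    job_url = job["url"]
    # Poll the job while it runs, downloading output as it appears so that
    # data does not accumulate on the remote system.
    while True:
        status = requests.get(job_url).json()
        for output in status.get("newOutputs", []):
            data = requests.get(output["url"])
            with open(output["name"], "wb") as f:
                f.write(data.content)
            requests.delete(output["url"])  # remove the transferred file remotely
        if status["state"] in ("FINISHED", "FAILED"):
            return status["state"]
        time.sleep(10)
```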

Relevance:

10.00%

Publisher:

Abstract:

Compute grids are used widely in many areas of environmental science, but there has been limited uptake of grid computing by the climate modelling community, partly because the characteristics of many climate models make them difficult to use with popular grid middleware systems. In particular, climate models usually produce large volumes of output data, and running them usually involves complicated workflows implemented as shell scripts. For example, NEMO (Smith et al., 2008) is a state-of-the-art ocean model that is currently used for operational ocean forecasting in France, and will soon be used in the UK for both ocean forecasting and climate modelling. On a typical modern cluster, a one-year global ocean simulation at 1-degree resolution takes about three hours running on 40 processors, and produces roughly 20 GB of output as 50,000 separate files. 50-year simulations are common, during which the model is resubmitted as a new job after each year. Running NEMO relies on a set of complicated shell scripts and command utilities for data pre-processing and post-processing prior to job resubmission.

Grid Remote Execution (G-Rex) is a pure Java grid middleware system that allows scientific applications to be deployed as Web services on remote computer systems, and then launched and controlled as if they were running on the user's own computer. Although G-Rex is general-purpose middleware, it has two key features that make it particularly suitable for remote execution of climate models: (1) output from the model is transferred back to the user while the run is in progress, to prevent it from accumulating on the remote system and to allow the user to monitor the model; (2) the client component is a command-line program that can easily be incorporated into existing model workflow scripts. G-Rex has a REST (Fielding, 2000) architectural style, which allows client programs to be very simple and lightweight, and allows users to interact with model runs using only a basic HTTP client (such as a Web browser or the curl utility) if they wish. This design also allows new client interfaces to be developed in other programming languages with relatively little effort. The G-Rex server is a standard Web application that runs inside a servlet container such as Apache Tomcat, and is therefore easy for system administrators to install and maintain.

G-Rex is employed as the middleware for the NERC Cluster Grid, a small grid of HPC clusters belonging to collaborating NERC research institutes. Currently the NEMO (Smith et al., 2008) and POLCOMS (Holt et al., 2008) ocean models are installed, and there are plans to install the Hadley Centre's HadCM3 model for use in the decadal climate prediction project GCEP (Haines et al., 2008). The science projects involving NEMO on the Grid have a particular focus on data assimilation (Smith et al., 2008), a technique that involves constraining model simulations with observations. The POLCOMS model will play an important part in the GCOMS project (Holt et al., 2008), which aims to simulate the world's coastal oceans.

A typical use of G-Rex by a scientist to run a climate model on the NERC Cluster Grid proceeds as follows: (1) the scientist prepares input files on his or her local machine; (2) using information provided by the Grid's Ganglia monitoring system, the scientist selects an appropriate compute resource; (3) the scientist runs the relevant workflow script on his or her local machine, which is unmodified except that calls to run the model (e.g. with "mpirun") are simply replaced with calls to "GRexRun" (a minimal sketch of this substitution follows the abstract); (4) the G-Rex middleware automatically handles the uploading of input files to the remote resource and the downloading of output files back to the user, including their deletion from the remote system, during the run; (5) the scientist monitors the output files using familiar analysis and visualization tools on his or her own local machine.

G-Rex is well suited to climate modelling because it addresses many of the middleware usability issues that have led to limited uptake of grid computing by climate scientists. It is a lightweight, low-impact and easy-to-install solution that is currently designed for use in relatively small grids such as the NERC Cluster Grid. A current topic of research is the use of G-Rex as an easy-to-use front-end to larger-scale Grid resources such as the UK National Grid Service.
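The only change the abstract describes to an existing workflow script is swapping the "mpirun" invocation for "GRexRun". A minimal sketch of such a wrapper follows; the GRexRun argument style and service name are assumptions, since the abstract does not spell them out.

```python
# Minimal sketch of the single change needed in an existing workflow
# script: calls to "mpirun" are replaced by calls to "GRexRun". The
# GRexRun arguments shown here are assumptions for illustration.
import subprocess

def run_model(model_args, remote=True):
    if remote:
        # GRexRun uploads input files, launches the job on the remote
        # cluster, and streams output back while the run is in progress.
        cmd = ["GRexRun", "nemo-service"] + model_args  # service name assumed
    else:
        cmd = ["mpirun", "-np", "40", "./nemo"] + model_args
    subprocess.run(cmd, check=True)

run_model(["-config", "ORCA1"])  # hypothetical NEMO configuration flag
```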

Relevance:

10.00%

Publisher:

Abstract:

Context: Learning can be regarded as knowledge construction, in which prior knowledge and experience serve as the basis from which learners expand their knowledge base. Such knowledge construction has to take place continuously in order to enhance learners' competence in a competitive working environment. As information consumers, individual users demand personalised information provision that meets their own specific purposes, goals, and expectations. Objectives: Current methods in requirements engineering are capable of modelling the common user's behaviour in the domain of knowledge construction. Users' requirements can be represented as cases in a defined structure, over which reasoning enables requirements analysis. Such analysis needs to be enhanced so that personalised information provision can be tackled and modelled; however, there is a lack of suitable modelling methods to achieve this end. This paper presents a new ontological method for capturing an individual user's requirements and transforming them into personalised information provision specifications, so that the right information can be provided to the right user for the right purpose. Method: An experiment was conducted using a qualitative method. A medium-sized group of users participated to validate the method and its techniques, i.e. articulation, mapping, configuration, and learning content, and the results were used as feedback for improvement. Result: The research has produced an ontology model with a set of techniques that support profiling users' requirements, reasoning over requirements patterns, generating workflows from norms, and formulating information provision specifications. Conclusion: Current requirements engineering approaches provide the methodical capability for developing solutions. Our research outcome, i.e. the ontology model with its techniques, can further enhance these approaches for modelling individual users' needs and discovering their requirements.
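To make the idea of transforming a requirements case into a provision specification concrete, here is a deliberately toy sketch; every field name and rule below is hypothetical and far simpler than the ontology model the paper describes.

```python
# Illustrative sketch only: a minimal case structure for a user's
# requirements and a rule that maps it to an information-provision
# specification. All fields and rules are hypothetical stand-ins for
# the paper's ontology-based reasoning.
from dataclasses import dataclass

@dataclass
class RequirementsCase:
    role: str          # e.g. "novice engineer"
    goal: str          # what the user wants to achieve
    prior_topics: set  # topics already mastered

@dataclass
class ProvisionSpec:
    topics: list       # ordered learning content to deliver
    form: str          # delivery format

def derive_spec(case: RequirementsCase, curriculum: list) -> ProvisionSpec:
    # Provide only the content the user has not yet covered, in
    # curriculum order -- a toy stand-in for reasoning over the ontology.
    remaining = [t for t in curriculum if t not in case.prior_topics]
    form = "tutorial" if case.role.startswith("novice") else "reference"
    return ProvisionSpec(topics=remaining, form=form)

spec = derive_spec(
    RequirementsCase("novice engineer", "learn workflow modelling", {"intro"}),
    ["intro", "petri-nets", "workflow-patterns"],
)
print(spec)  # ProvisionSpec(topics=['petri-nets', 'workflow-patterns'], form='tutorial')
```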

Relevance:

10.00%

Publisher:

Abstract:

Students may have difficulty in understanding some of the complex concepts they are taught in the general areas of science and engineering. While practical work, such as a laboratory-based examination of the performance of structures, has an important role in knowledge construction, it has some limitations. Blended learning supports different learning styles and hence further benefits knowledge building. This research is an empirical study of how vodcasts (video podcasts) can be used to enrich the learning experience in the structural properties of materials laboratory of an undergraduate course. Students were given the opportunity to download and view vodcasts on the theory before and after the experimental work; it was their choice when (before, after, or both before and after the practical) and how many times to view them. In blended learning, the combination of face-to-face teaching, vodcasts, printed materials, practical experiments, report writing and instructors' feedback benefits learners with different learning styles. To prepare for the practical, students were informed of the availability of the vodcasts prior to the practical session. After the practical work, each student submitted an individual laboratory report for the assessment of the structures laboratory. The data collection consisted of a questionnaire completed by the students, follow-up semi-structured interviews, and the practical reports submitted for assessment. The questionnaire results were analysed quantitatively, while the data from the assessment reports were analysed qualitatively. The analysis shows that most of the students who had not fully grasped the theory after the practical managed to gain the required knowledge by viewing the vodcasts. According to their feedback, the students felt that they had control over how to use the material and could view it as many times as they wished; some students who had already understood the theory chose to view it once or not at all. Their understanding was demonstrated by the explanations in their reports, and illustrated by the approach they took to explain the results of their experimental work. The findings are valuable to instructors who design, develop and deliver different types of blended learning, and to learners who try different blended approaches. Recommendations are made on the role of this innovative application of vodcasts in knowledge construction for the structures laboratory, and to guide future work in this area of research.

Relevance:

10.00%

Publisher:

Abstract:

This paper describes a case study of an electronic data management system developed in-house by the Facilities Management Directorate (FMD) of an educational institution in the UK. The FMD Maintenance and Business Services department is responsible for maintaining the built estate owned by the university, and needs a clear definition of the type of work undertaken and of the administration that enables maintenance work to be carried out. This includes the management of resources, budget, cash flow, and the workflow of reactive, preventative and planned maintenance of the campus. To support the business process more efficiently, the FMD decided to move from a paper-based information system to an electronic system, WREN. Among WREN's main advantages are that it is tailor-made to fit the purpose of its users, it is cost-effective when modifications to the system are needed, and its database can also be used as a knowledge management tool. There is a trade-off: because WREN is tailored to the specific requirements of the FMD, it may not be easy to implement within a different institution without extensive modification. However, WREN not only allows the FMD to carry out the tasks of maintaining and looking after the university's built estate, but has also achieved its aim of minimising costs and maximising efficiency.

Relevance:

10.00%

Publisher:

Abstract:

The effect of poultry species (broiler or turkey) and genotype (Wrolstad or BUT T8 turkeys; Ross 308 or Cobb 500 broilers) on the efficiency with which dietary long-chain n-3 PUFA were incorporated into poultry meat was determined. Broilers and turkeys of both genotypes were fed one of six diets varying in fatty acid (FA) composition (two replicates per genotype × diet interaction). Diets contained 50 g/kg added oil, which was either blended vegetable oil (control), or partially replaced with linseed oil (20 or 40 g/kg diet), fish oil (20 or 40 g/kg diet), or a mixture of the two (20 g linseed oil and 20 g fish oil/kg diet). Feeds and samples of skinless breast and thigh meat were analyzed for FA. Wrolstad dark meat was slightly more responsive than BUT T8 (P = 0.046) to increased dietary 18:3 concentrations (slopes of 0.570 and 0.465, respectively). The Ross 308 was also slightly more responsive than the Cobb 500 (P = 0.002) in this parameter (slopes of 0.557 and 0.449). There were no other significant differences between the genotypes. There was some evidence (based on the estimates of the slopes and their associated standard errors) that white turkey meat was more responsive than white chicken meat to 20:5 (slopes of 0.504 and 0.289 for turkeys and broilers, respectively). There was no relationship between dietary 18:3 n-3 content and meat 20:5 and 22:6 contents. If birds do convert 18:3 to higher FA, these acids are not then deposited in the edible tissues.
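The genotype comparison above rests on fitting a slope of meat fatty acid content against dietary concentration for each genotype and comparing the slopes. The sketch below shows that calculation with invented numbers; it is not the paper's data or its statistical model (which would also test the slope difference formally).

```python
# Toy illustration of the slope comparison: regress meat 18:3 content on
# dietary 18:3 concentration for each genotype and compare fitted slopes.
# All numbers here are invented for illustration only.
import numpy as np

diet_18_3 = np.array([5.0, 10.0, 15.0, 20.0])   # dietary 18:3, g/kg (hypothetical)
meat_a    = np.array([3.1, 5.9, 8.7, 11.5])     # genotype A meat 18:3 (hypothetical)
meat_b    = np.array([2.8, 5.1, 7.4, 9.6])      # genotype B meat 18:3 (hypothetical)

slope_a, _ = np.polyfit(diet_18_3, meat_a, 1)
slope_b, _ = np.polyfit(diet_18_3, meat_b, 1)
print(f"genotype A slope: {slope_a:.3f}")  # responsiveness of genotype A
print(f"genotype B slope: {slope_b:.3f}")  # a steeper slope = more responsive
```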

Relevance:

10.00%

Publisher:

Abstract:

Stable isotope labeling combined with MS is a powerful method for measuring relative protein abundances, for instance, by differential metabolic labeling of some or all amino acids with 14N and 15N in cell culture or hydroponic media. These and most other types of quantitative proteomics experiments using high-throughput technologies, such as LC-MS/MS, generate large amounts of raw MS data. This data needs to be processed efficiently and automatically, from the mass spectrometer to statistically evaluated protein identifications and abundance ratios. This paper describes in detail an approach to the automated analysis of uniformly 14N/15N-labeled proteins using MASCOT peptide identification in conjunction with the trans-proteomic pipeline (TPP) and a few scripts to integrate the analysis workflow. Two large proteomic datasets from uniformly labeled Arabidopsis thaliana were used to illustrate the analysis pipeline. The pipeline can be fully automated and uses only common or freely available software.
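The pipeline described is essentially a sequence of external tools glued together by scripts. The sketch below shows the general shape of such a glue script; the command names and arguments are placeholders rather than the real Mascot or TPP invocations, which are installation-specific.

```python
# Schematic sketch of the kind of glue script described above: each stage
# of the 14N/15N analysis is an external tool invoked in sequence. The
# command names and arguments are placeholders, not real Mascot or TPP
# command lines.
def run(cmd):
    # A real pipeline would call subprocess.run(cmd, check=True); printed
    # here because the commands are placeholders.
    print("would run:", " ".join(cmd))

raw_files = ["sample1.raw", "sample2.raw"]
for raw in raw_files:
    mzxml = raw.replace(".raw", ".mzXML")
    run(["convert_raw_to_mzxml", raw, mzxml])                        # placeholder converter
    run(["search_engine_cli", mzxml, "--db", "arabidopsis.fasta"])   # placeholder search step

# Statistical validation and 14N/15N ratio estimation: placeholder steps
# standing in for the TPP's validation and quantitation stages.
run(["validate_identifications"] + [f.replace(".raw", ".pep.xml") for f in raw_files])
run(["compute_abundance_ratios", "--labels", "14N,15N"])
```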

Relevance:

10.00%

Publisher:

Abstract:

Web service composition can be facilitated by an automatic process consisting of rules, conditions and actions. This research has adapted the Elementary Petri Net (EPN) to analyze and model web services and their composition. This paper describes a set of techniques for representing the transition rules, algorithm and workflow so that web service composition can be carried out automatically.
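For readers unfamiliar with elementary Petri nets: places hold at most one token, and a transition is enabled when all of its input places are marked and all of its output places are empty. The sketch below encodes that standard firing rule for a two-service composition; the net itself is invented for illustration and is not the paper's model.

```python
# Minimal sketch of an elementary Petri net (EPN) of the kind used to
# model service composition. A transition fires when every input place
# holds a token and every output place is empty.
def enabled(transition, marking):
    inputs, outputs = transition
    return all(marking[p] for p in inputs) and not any(marking[p] for p in outputs)

def fire(transition, marking):
    inputs, outputs = transition
    for p in inputs:
        marking[p] = 0   # consume tokens from input places
    for p in outputs:
        marking[p] = 1   # produce tokens in output places

# Places: request received -> service A done -> service B done
marking = {"request": 1, "a_done": 0, "b_done": 0}
invoke_a = (["request"], ["a_done"])   # transition for invoking service A
invoke_b = (["a_done"], ["b_done"])    # transition for invoking service B

for t in (invoke_a, invoke_b):
    if enabled(t, marking):
        fire(t, marking)
print(marking)  # {'request': 0, 'a_done': 0, 'b_done': 1}
```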

Relevance:

10.00%

Publisher:

Abstract:

This paper describes a prototype grid infrastructure, called the eMinerals minigrid, for molecular simulation scientists, which is based on an integration of shared compute and data resources. We describe the key components, namely the use of Condor pools; Linux/Unix clusters with PBS and IBM's LoadLeveler job-handling tools; Globus for security handling; Condor-G tools for wrapping Globus job-submit commands; Condor's DAGMan tool for handling workflow; the Storage Resource Broker for handling data; and the CCLRC dataportal and associated tools for both archiving data with metadata and making data available to other workers.
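The workflow-handling role played by DAGMan in this setup is, at its core, ordering jobs by their dependencies and submitting each one only after its parents complete. The sketch below shows that idea with Python's standard-library topological sorter; the job names and dependency graph are invented for illustration.

```python
# Toy illustration of the job-ordering problem a DAGMan-style workflow
# tool solves: run each job only after all of its parents have finished.
from graphlib import TopologicalSorter  # Python 3.9+

# job -> set of jobs it depends on (names are hypothetical)
dag = {
    "prepare_inputs": set(),
    "run_simulation": {"prepare_inputs"},
    "post_process":   {"run_simulation"},
    "archive_to_srb": {"post_process"},
}

for job in TopologicalSorter(dag).static_order():
    print("submit:", job)   # a real tool would submit to Condor/PBS here
```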

Relevance:

10.00%

Publisher:

Abstract:

Technology-enhanced or computer-aided learning (e-learning) can be institutionally integrated and supported by learning management systems or Virtual Learning Environments (VLEs) to deliver the efficiency gains, effectiveness and scalability of the e-learning paradigm. However, this can only be achieved by integrating pedagogically intelligent approaches with a lesson-preparation tool environment and a VLE that are well accepted by both students and teachers. This paper critically explores some of the issues relevant to the scalable routinisation of e-learning at the tertiary level, typically for first-year university undergraduates, using the teaching of Relational Data Analysis (RDA), supported by multimedia authoring, as a case study. The paper concludes that blended learning approaches, which balance the deployment of e-learning with other modes of delivery such as instructor-mediated group learning, offer the most flexible and scalable route to e-learning. This, however, requires the graceful integration of platforms for multimedia production, distribution and delivery through advanced interactive spaces that provoke learner engagement and promote learner autonomy and group learning, facilitated by a cooperative-creative learning environment that remains open to personal exploration of constructivist-constructionist pathways to learning.

Relevance:

10.00%

Publisher:

Abstract:

Grid portals are increasingly used to provide uniform access to grid infrastructure. This paper describes how the P-GRADE Grid Portal can be used in a collaborative manner to facilitate group work and support the notion of Virtual Organisations. We describe the development issues involved in constructing a collaborative portal, including ensuring a consistent view of a collaborative workflow among participants, and managing proxy credentials so that separate nodes of the workflow can be submitted to different grids.

Relevance:

10.00%

Publisher:

Abstract:

With the rapid development of proteomics, a number of different methods have appeared for the basic task of protein identification. We made a simple comparison between a common liquid chromatography-tandem mass spectrometry (LC-MS/MS) workflow using an ion trap mass spectrometer and a combined LC-MS and LC-MS/MS method using Fourier transform ion cyclotron resonance (FTICR) mass spectrometry and accurate peptide masses. To compare the two methods for protein identification, we grew and extracted proteins from E. coli using established protocols. Cystines were reduced and alkylated, and proteins were digested with trypsin. The resulting peptide mixtures were separated by reversed-phase liquid chromatography using a 4 h gradient from 0 to 50% acetonitrile over a C18 reversed-phase column. The LC separation was coupled on-line to either a Bruker Esquire HCT ion trap or a Bruker 7 tesla APEX-Qe Qh-FTICR hybrid mass spectrometer. Data-dependent Qh-FTICR-MS/MS spectra were acquired using the quadrupole mass filter and collisionally induced dissociation in the external hexapole trap. In both schemes, proteins were identified by Mascot MS/MS ion searches, and the peptides identified from these proteins in the FTICR MS/MS data were used, together with ambient polydimethylcyclosiloxane ions, for automatic internal calibration of the FTICR-MS data.
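The internal calibration step works by fitting a correction from measured to reference m/z values, using confidently identified peptides and the ambient polydimethylcyclosiloxane ions as calibrants. A toy version of such a fit is sketched below; the linear model and all masses are invented for illustration and are not the paper's calibration procedure.

```python
# Toy sketch of internal calibration: known reference masses are used to
# fit a linear m/z correction, which is then applied to the dataset.
# All masses below are invented for illustration.
import numpy as np

measured = np.array([445.1221, 536.1654, 802.3312, 1021.4478])  # observed m/z (hypothetical)
true     = np.array([445.1200, 536.1630, 802.3280, 1021.4440])  # reference m/z (hypothetical)

# Fit true = a * measured + b by least squares.
a, b = np.polyfit(measured, true, 1)

def calibrate(mz):
    return a * mz + b

errors_ppm = (calibrate(measured) - true) / true * 1e6
print("residual error (ppm):", np.round(errors_ppm, 2))
```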

Relevance:

10.00%

Publisher:

Abstract:

Human consumption of long-chain n-3 polyunsaturated fatty acids (LC n-3 PUFA) is below recommendations, and enriching chicken meat (by incorporating LC n-3 PUFA into broiler diets) is a viable means of increasing consumption. Fish oil is the most common LC n-3 PUFA supplement used, but is unsustainable and reduces the oxidative stability of the meat. The objective of this experiment was to compare fresh fish oil (FFO) with fish oil encapsulated (EFO) in a gelatin matrix (to maintain its oxidative stability) and with algal biomass at a low (LAG, 11), medium (MAG, 22), or high (HAG, 33 g/kg of diet) level of inclusion. The C22:6n-3 contents of the FFO, EFO, and MAG diets were equal. A control (CON) diet using blended vegetable oil was also made. As-hatched 1-d-old Ross 308 broilers (144) were reared (21 d) on a common starter diet, then allocated to treatment pens (4 pens per treatment, 6 birds per pen) and fed treatment diets for 21 d before being slaughtered. Breast and leg meat was analyzed (per pen) for fatty acids, and cooked samples (2 pens per treatment) were analyzed for volatile aldehydes. Concentrations (mg/100 g of meat) of C20:5n-3, C22:5n-3, and C22:6n-3 were (respectively) CON: 4, 15, 24; FFO: 31, 46, 129; EFO: 18, 27, 122; LAG: 9, 19, 111; MAG: 6, 16, 147; and HAG: 9, 14, 187 (SEM: 2.4, 3.6, 13.1) in breast meat and CON: 4, 12, 9; FFO: 58, 56, 132; EFO: 63, 49, 153; LAG: 13, 14, 101; MAG: 11, 15, 102; HAG: 37, 37, 203 (SEM: 7.8, 6.7, 14.4) in leg meat. Cooked EFO and HAG leg meat was more oxidized (5.2 mg of hexanal/kg of meat) than the other meats (mean 2.2 mg/kg, SEM 0.63). It is concluded that algal biomass is as effective as fish oil at enriching broiler diets with C22:6 LC n-3 PUFA, and that, at equal C22:6n-3 contents, there is no significant difference between these two supplements in the oxidative stability of the meat produced.