Abstract:
Background/Aim. Mesenchymal stromal cells (MSCs) have been utilised in many clinical trials as an experimental treatment in numerous clinical settings. Bone marrow remains the traditional source tissue for MSCs but is relatively hard to access in large volumes. Alternatively, MSCs may be derived from other tissues, including the placenta and adipose tissue. In an initial study, no obvious differences in parameters such as cell surface phenotype, chemokine receptor display, mesodermal differentiation capacity or immunosuppressive ability were detected when we compared human marrow-derived MSCs with human placenta-derived MSCs. The aim of this study was to establish and evaluate a protocol and related processes for the preparation of placenta-derived MSCs for early-phase clinical trials. Methods. A full-term placenta was taken after delivery of the baby as a source of MSCs. Isolation, seeding, incubation and cryopreservation of human placenta-derived MSCs, as well as the production release criteria used, were in accordance with the complex regulatory requirements applicable to Code of Good Manufacturing Practice manufacturing of ex vivo expanded cells. Results. We established and evaluated instructions for an MSC preparation protocol and give an overview of its application in three clinical areas. In the first trial, MSCs were co-transplanted intravenously to a patient receiving an allogeneic cord blood transplant as therapy for treatment-refractory acute myeloid leukemia. In the second trial, MSCs were administered intravenously in the treatment of idiopathic pulmonary fibrosis, without serious adverse effects. In the third trial, MSCs were injected directly into the site of tendon damage under ultrasound guidance in the treatment of chronic refractory tendinopathy. Conclusion. Clinical trials using both allogeneic and autologous cells demonstrated MSCs to be safe. 
The described protocol for human placenta-derived MSCs is appropriate for use in a clinical setting, is relatively inexpensive, and can be relatively easily adjusted to a different set of regulatory requirements, as applicable to early-phase clinical trials.
Abstract:
In this paper, we aim to predict protein structural classes for low-homology data sets based on predicted secondary structures. We propose a new and simple kernel method, named SSEAKSVM, to predict protein structural classes. The secondary structures of all protein sequences are obtained using the tool PSIPRED, and then a linear kernel based on secondary structure element alignment scores is constructed for training a support vector machine classifier without parameter tuning. Our method SSEAKSVM was evaluated on two low-homology data sets, 25PDB and 1189, with sequence homology of 25% and 40%, respectively. The jackknife test is used to test and compare our method with other existing methods. The overall accuracies on these two data sets are 86.3% and 84.5%, respectively, which are higher than those obtained by other existing methods. In particular, our method achieves higher accuracies (88.1% and 88.5%) for differentiating the α + β class and the α/β class compared to other methods. This suggests that our method is valuable for predicting protein structural classes, particularly for low-homology protein sequences. The source code of the method in this paper can be downloaded at http://math.xtu.edu.cn/myphp/math/research/source/SSEAK_source_code.rar.
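The core idea of the abstract, an SVM trained on a precomputed linear kernel of pairwise alignment scores, can be sketched with scikit-learn's precomputed-kernel interface. The `ssea_score` function below is a hypothetical stand-in for the real secondary structure element alignment; the toy sequences and labels are illustrative assumptions, not data from the paper.

```python
# Sketch: SVM classification with a precomputed kernel built from
# pairwise similarity scores between secondary-structure strings.
import numpy as np
from sklearn.svm import SVC

def ssea_score(a: str, b: str) -> float:
    """Toy alignment score: fraction of matching states (H/E/C)
    over the shorter string -- a placeholder for a real SSEA score."""
    n = min(len(a), len(b))
    return sum(x == y for x, y in zip(a[:n], b[:n])) / n

# Hypothetical secondary-structure strings (H=helix, E=strand, C=coil).
train = ["HHHHCCEE", "EEEECCHH", "HHHHHHCC", "EEEEEECC"]
labels = [0, 1, 0, 1]

# The Gram matrix of pairwise scores acts as the linear kernel.
K = np.array([[ssea_score(a, b) for b in train] for a in train])

clf = SVC(kernel="precomputed")  # no kernel parameter to tune
clf.fit(K, labels)

# To classify a new sequence, score it against every training sequence.
query = "HHHCCCEE"
k_query = np.array([[ssea_score(query, b) for b in train]])
print(clf.predict(k_query)[0])
```

Because the kernel is precomputed, any alignment-based similarity can be dropped in without touching the classifier, which is what lets the method run "without parameter tuning".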
Abstract:
Widening participation brings with it increasing diversity and increased variation in the level of academic preparedness (Clarke, 2011; Nelson, Clarke, & Kift, 2010). Cultural capital, coupled with negotiating the academic culture, creates an environment based on many assumptions about academic writing and university culture. Variations in staff and student expectations relating to the teaching and learning experience are captured in a range of national and institutional data (AUSSE, CEQ, LEX). Nationally, AUSSE data (2009) indicate that for communication, writing, speaking and analytic skills, staff expectations are quite a bit higher than students'. The research team noted a recognisable shift in the changing cohort of students and their understanding of and engagement with feedback and CRAs, as well as variations in teaching staff and student expectations. The current reality of tutor and student roles is that:
- Students self-select when and how they access lectures and tutorials.
- Shorter tutorial times result in reduced opportunity to develop rapport with students.
- CRAs are not always used consistently by staff (different marking styles and levels of feedback).
- Marking is not always undertaken by the student's tutor/lecturer.
- Student support services might be recommended to students once a poor grade has been given. Students can perceive this as remedial and a further sense of failure.
- The CRA sheet has a mark/grade attached to it. There is stigma attached to a low mark, and it is hard to focus on the CRA feedback with a poor mark etched next to it.
- There are limited opportunities for sessional staff to access professional development to assist with engaging students and feedback.
- FYE resources exist; however, academic time is a factor in exploring and embedding these resources.
Feedback is another area with differing expectations and understandings. Sadler (2009) contends that students are not equipped to decode the statements properly. 
For students to be able to apply feedback, they need to understand the meaning of the feedback statement. They also need to identify the particular aspects of their work that need attention. The proposed checklist/guide would be one page and submitted with each assessment piece, thereby providing an interface to engage students and tutors in managing first-year understandings and expectations around CRAs, feedback, and academic practice.
Abstract:
Past research has suggested that social networking sites are the most common source of social engineering-based attacks. Persuasion research shows that people are more likely to obey and accept a message when the source's presentation appears to be credible. However, many factors can impact the perceived credibility of a source, depending on its type and the characteristics of the environment. Our previous research showed that there are four dimensions of source credibility in terms of social engineering on Facebook: perceived sincerity, perceived competence, perceived attraction, and perceived worthiness. Because the dimensions of source credibility, as well as their measurement scales, can fluctuate from one type of source to another and from one type of context to another, our aims in this study are to validate the existence of those four dimensions of the credibility of social engineering attackers on Facebook and to develop a valid measurement scale for each of them.
Abstract:
Airborne organic pollutants have significant impacts on health; however, their sources, atmospheric characteristics and resulting human exposures are poorly understood. This research characterized the chemical composition of atmospheric volatile organic compounds, polycyclic aromatic hydrocarbons and carbonyls in a representative number of primary schools in the Brisbane Metropolitan Area, quantified their concentrations, assessed their toxicity and apportioned them to their sources. The findings expand scientific knowledge of these pollutants and will contribute towards science-based management of risks associated with pollution emissions and air quality in schools and other urban and indoor environments.
Abstract:
Polycyclic Aromatic Hydrocarbons (PAHs) represent a major class of toxic pollutants because of their carcinogenic and mutagenic characteristics. People living in urban areas are regularly exposed to PAHs because of the abundance of their emission sources. Within this context, this study aimed to: (i) identify and quantify the levels of ambient PAHs in an urban environment; (ii) evaluate their toxicity; and (iii) identify their sources as well as the contribution of specific sources to measured concentrations. Sixteen PAHs were identified and quantified in air samples collected from Brisbane. Principal Component Analysis – Absolute Principal Component Scores (PCA-APCS) was used to conduct source apportionment of the measured PAHs. Vehicular emissions, natural gas combustion, petrol emissions and evaporative/unburned fuel were the sources identified, contributing 56%, 21%, 15% and 8% of the total PAH emissions, respectively, all of which need to be considered in any pollution control measures implemented in urban areas.
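The PCA-APCS procedure named above can be sketched end to end on synthetic data: standardise the species concentrations, extract principal components, subtract the scores of a hypothetical zero-concentration sample to obtain absolute scores, then regress concentrations on those scores. The two source profiles and all numbers below are invented for illustration; they are not the study's data or results.

```python
# Minimal PCA-APCS sketch on synthetic two-source "PAH" data.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic source profiles (rows: sources, cols: species) and daily
# source strengths; measured concentrations are their mixture + noise.
profiles = np.array([[5.0, 4.0, 3.0, 0.5, 0.2, 0.1],
                     [0.2, 0.5, 1.0, 3.0, 4.0, 5.0]])
strengths = rng.uniform(0.5, 2.0, size=(200, 2))
C = strengths @ profiles + rng.normal(0, 0.05, size=(200, 6))

# 1. Standardise each species.
mu, sd = C.mean(0), C.std(0)
Z = (C - mu) / sd

# 2. PCA via SVD; keep the two leading components.
U, S, Vt = np.linalg.svd(Z, full_matrices=False)
scores = Z @ Vt[:2].T

# 3. Absolute scores: subtract the scores of an artificial sample
#    with zero concentration in every species.
z0 = (0.0 - mu) / sd
apcs = scores - z0 @ Vt[:2].T

# 4. Regress species concentrations on the APCS; the fitted terms are
#    the per-source contributions to each species.
X = np.column_stack([apcs, np.ones(len(apcs))])
coef, *_ = np.linalg.lstsq(X, C, rcond=None)
predicted = X @ coef
r2 = 1 - ((C - predicted) ** 2).sum() / ((C - C.mean(0)) ** 2).sum()
print(f"variance explained by 2 reconstructed sources: {r2:.3f}")
```

With two genuine sources in the synthetic data, two components reconstruct nearly all of the variance, which is the property the method relies on when apportioning measured concentrations to sources.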
Abstract:
WinBUGS code and data to reproduce our network meta-analysis from "Control strategies to prevent total hip replacement-related infections: a systematic review and mixed treatment comparison" published in BMJ Open.
Abstract:
Over recent decades, efforts have been made to reduce human exposure to atmospheric pollutants, including polycyclic aromatic hydrocarbons (PAHs) and polychlorinated biphenyls (PCBs), through emission control and abatement. Along with the potential changes in their concentrations resulting from these efforts, profiles of emission sources may have also changed over such extended timeframes. However, relevant data are quite limited in the Southern Hemisphere. We revisited two sampling sites in an Australian city where concentration data from 1994/5 for atmospheric PAHs and PCBs were available. Monthly air samples from July 2013 to June 2014 at the two sites were collected and analysed for these compounds, using protocols similar to the original study. A prominent seasonal pattern was observed for PAHs, with elevated concentrations in cooler months, whereas PCB levels showed little seasonal variation. Compared to two decades ago, atmospheric concentrations of ∑13 PAHs (gaseous + particle-associated) in this city have decreased by approximately one order of magnitude and the apparent halving time (t1/2) was estimated as 6.2 ± 0.56 years. ∑6 iPCB concentrations (median value; gaseous + particle-associated) have decreased by 80%, with an estimated t1/2 of 11 ± 2.9 years. These trends and values are similar to those reported for comparable sites in the Northern Hemisphere. To characterise emission source profiles, samples were also collected from a bushfire event and within a vehicular tunnel. Emissions from bushfires are suggested to be an important contributor to the current atmospheric concentrations of PAHs in this city. This contribution is more important in cooler months, i.e. June, July and August, and its importance may have increased over the last two decades.
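The apparent halving time reported above can be sanity-checked with the standard first-order decay relation, t1/2 = Δt·ln2 / ln(C_old/C_new). Treating the decline as exactly one order of magnitude over the roughly 19 years between the two campaigns (both assumptions, taken loosely from the abstract) lands close to the reported 6.2-year figure:

```python
# Back-of-envelope halving-time estimate under first-order decay.
import math

dt = 19.0       # years between the 1994/5 and 2013/14 campaigns
ratio = 10.0    # C_old / C_new: one order of magnitude

t_half = dt * math.log(2) / math.log(ratio)
print(f"apparent halving time: {t_half:.1f} years")  # ~5.7 years
```

The small gap between this crude two-point estimate and the published 6.2 ± 0.56 years is expected, since the study fitted monthly data rather than two endpoints.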
Abstract:
A computer code is developed for the numerical prediction of natural convection in rectangular two-dimensional cavities at high Rayleigh numbers. The governing equations are retained in primitive variable form. The numerical method is based on finite differences and an ADI scheme. Convective terms may be approximated with either central or hybrid differencing, the latter for greater stability. A non-uniform grid distribution is possible for greater efficiency. The pressure is dealt with via a SIMPLE-type algorithm, and the use of a fast elliptic solver for the solenoidal velocity correction field significantly reduces computing times. Preliminary results indicate that the code is reasonably accurate, robust and fast compared with existing benchmarks and finite-difference-based codes, particularly at high Rayleigh numbers. Extension to three-dimensional problems and turbulence studies in similar geometries is readily possible and is indicated.
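The central-versus-hybrid differencing choice mentioned above can be illustrated with the standard Patankar hybrid formula for a control-volume face coefficient: central differencing is effectively used while the cell Peclet number is small (|Pe| < 2) and first-order upwinding takes over beyond that, which is where the extra stability at high Rayleigh numbers comes from. The function name and values here are illustrative, not taken from the code the abstract describes.

```python
# Hybrid differencing: west-face coefficient of a 1-D control volume.
def hybrid_face_coefficient(F: float, D: float) -> float:
    """a_W for the west face.

    F: convective mass flux through the face,
    D: diffusive conductance of the face.
    Standard hybrid form: a_W = max(F, D + F/2, 0).
    """
    return max(F, D + F / 2.0, 0.0)

# Diffusion-dominated face (Pe = F/D < 2): central-difference value.
print(hybrid_face_coefficient(F=1.0, D=1.0))   # D + F/2 = 1.5
# Strongly convective face (Pe > 2): pure upwind, a_W = F.
print(hybrid_face_coefficient(F=4.0, D=1.0))   # 4.0
# Strong reverse flow: upwinding from the other side, a_W = 0.
print(hybrid_face_coefficient(F=-10.0, D=1.0))  # 0.0
```

The max() form makes the scheme switch automatically per face, so a single assembly loop handles both regimes.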
Abstract:
The use of UAVs for remote sensing tasks, e.g. agriculture and search and rescue, is increasing. The ability for UAVs to autonomously find a target and perform on-board decision making, such as descending to a new altitude or landing next to a target, is a desired capability. Computer-vision functionality allows the Unmanned Aerial Vehicle (UAV) to follow a designated flight plan, detect an object of interest, and change its planned path. In this paper we describe a low-cost, open-source system in which all image processing is performed on board the UAV using a Raspberry Pi 2 interfaced with a camera. The Raspberry Pi and the autopilot are physically connected through serial and communicate via MAVProxy. The Raspberry Pi continuously monitors the flight path in real time through a USB camera module. The algorithm checks whether the target has been captured. If the target is detected, the position of the object in the frame is represented in Cartesian coordinates and converted into estimated GPS coordinates. In parallel, the autopilot receives the approximate GPS location of the target and makes a decision to guide the UAV to a new location. This system also has potential uses in the field of precision agriculture, for detecting plant pests and disease outbreaks, which cause detrimental financial damage to crop yields if not detected early. Results show the algorithm detects 99% of objects of interest, and that the UAV is capable of navigating and making on-board decisions.
Abstract:
Corporate executives require relevant and intelligent business information in real time to make strategic decisions, and they require the freedom to access this information anywhere and anytime. There is a need to extend this functionality beyond the office and to the fingertips of decision makers. The Mobile Business Intelligence Tool (MBIT) aims to provide these features in a flexible and cost-efficient manner. This paper describes the detailed architecture of MBIT, designed to overcome the limitations of existing mobile business intelligence tools. Further, a detailed implementation framework is presented to realize the design. This research highlights the benefits of using service-oriented architecture to design flexible and platform-independent mobile business applications. © 2009 IEEE.
Abstract:
Impervious surfaces in an urban catchment are the primary stormwater pollutant contributing areas. Appropriate treatment of stormwater runoff from these impervious surfaces is essential to safeguard the urban water environment. While urban roads have received significant research attention in this regard, roofs have not been well investigated. Key pollutant processes such as build-up can differ between roads and roofs due to their different surface characteristics, which means different treatment strategies are needed for roads and roofs. This research study characterized roof pollutant build-up by comparison with road surfaces. It was noted that pollutants are more highly concentrated on particles, and particularly finer particles, in the case of roof surfaces compared to road surfaces. Additionally, pollutant build-up on roof surfaces tends to be relatively more variable from one day to another in terms of pollutant loads. These results highlight the significance of roofs as a stormwater pollutant source and the need for a specific stormwater treatment strategy rather than a combined approach for treating stormwater runoff from both roads and roofs.
Abstract:
Carrier phase ambiguity resolution over long baselines is challenging in BDS data processing. This is partially due to the variations of the hardware biases in BDS code signals and their dependence on elevation angle. We present an assessment of satellite-induced code bias variations in BDS triple-frequency signals and ambiguity resolution procedures involving both geometry-free and geometry-based models. First, since the elevation of a GEO satellite remains unchanged, we propose to model the single-differenced fractional cycle bias with widespread ground stations. Second, the effects of code bias variations induced by GEO, IGSO and MEO satellites on ambiguity resolution of extra-wide-lane, wide-lane and narrow-lane combinations are analyzed. Third, together with the IGSO and MEO code bias variation models, the effects of code bias variations on ambiguity resolution are examined using 30 days of data collected in 2014 over baselines ranging from 500 to 2600 km. The results suggest that although the effect of code bias variations on the extra-wide-lane integer solution is almost negligible due to its long wavelength, the wide-lane integer solutions are rather sensitive to the code bias variations. Wide-lane ambiguity resolution success rates are evidently improved when code bias variations are corrected. However, the improvement in narrow-lane ambiguity resolution is not obvious, since it is based on a geometry-based model and there is only an indirect impact on the narrow-lane ambiguity solutions.
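The wavelength argument in the conclusion can be made concrete by computing the combination wavelengths from the BDS-2 B1I/B2I/B3I carrier frequencies: the extra-wide-lane wavelength is several metres, so residual code biases barely perturb its integer solution, while the wide-lane and especially the narrow-lane wavelengths are short enough to be sensitive to them.

```python
# Wavelengths of BDS triple-frequency linear combinations.
C = 299_792_458.0        # speed of light, m/s
F_B1 = 1_561.098e6       # B1I carrier frequency, Hz
F_B2 = 1_207.140e6       # B2I
F_B3 = 1_268.520e6       # B3I

extra_wide = C / (F_B3 - F_B2)   # extra-wide-lane (B2, B3)
wide = C / (F_B1 - F_B2)         # wide-lane (B1, B2)
narrow = C / (F_B1 + F_B2)       # narrow-lane (B1, B2)

print(f"extra-wide-lane: {extra_wide:.2f} m")  # ~4.88 m
print(f"wide-lane:       {wide:.3f} m")        # ~0.847 m
print(f"narrow-lane:     {narrow:.3f} m")      # ~0.108 m
```

A code bias of a few decimetres is a small fraction of a 4.88 m extra-wide-lane cycle but a large fraction of a 0.11 m narrow-lane cycle, which matches the sensitivity ordering the study reports.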