899 results for Generation of 1898


Relevance: 90.00%

Abstract:

While it is generally accepted in the learning and teaching literature that assessment is the single biggest influence on how students approach their learning, assessment methods within higher education are generally conservative and inflexible. Constrained by policy and accreditation requirements and the need for the explicit articulation of assessment standards for public accountability purposes, assessment tasks can fail to engage students or to reflect the tasks students will face in the world of practice. Innovative assessment design can simultaneously deliver program objectives and active learning through a knowledge transfer process which increases student participation. This social constructivist view highlights that acquiring an understanding of assessment processes, criteria and standards requires active student participation. Within this context, a peer-assessed, weekly assessment task was introduced in the first “serious” accounting subject offered as part of an undergraduate degree. The positive outcomes of this assessment innovation were that student failure rates declined by 15%, tutorial participation increased fourfold, tutorial engagement increased sixfold, and there was a 100% approval rating for the retention of the assessment task. In contributing to the core conference theme of “seismic” shifts within higher education, and in stark contrast to the positive student response, staff-related issues of assessment conservatism and the necessity of meeting increasing research commitments threatened the assessment task’s survival. These forces opposing change have the potential to weaken the ability of higher education assessment arrangements to adequately serve either a new generation of students or the sector's community stakeholders.

Relevance: 90.00%

Abstract:

This thesis presents the outcomes of a comprehensive research study undertaken to investigate the influence of rainfall and catchment characteristics on urban stormwater quality. The knowledge created is expected to contribute to a greater understanding of urban stormwater quality and thereby enhance the design of stormwater quality treatment systems. The research study was based on selected urban catchments on the Gold Coast, Australia. The research methodology included field investigations, laboratory testing, computer modelling and data analysis. Both univariate and multivariate data analysis techniques were used to investigate the influence of rainfall and catchment characteristics on urban stormwater quality. The rainfall characteristics investigated included average rainfall intensity and rainfall duration, whilst catchment characteristics included land use, impervious area percentage, urban form and pervious area location. The catchment-scale data for the analysis was obtained from four residential catchments, including rainfall-runoff records, drainage network data, stormwater quality data and land use and land cover data. Pollutant build-up samples were collected from twelve road surfaces in residential, commercial and industrial land use areas. The relationships between rainfall characteristics, catchment characteristics and urban stormwater quality were investigated based on residential catchments and then extended to other land uses. Based on the influence rainfall characteristics exert on urban stormwater quality, rainfall events can be classified into three types: high average intensity-short duration (Type 1), high average intensity-long duration (Type 2) and low average intensity-long duration (Type 3). This provides an innovative alternative to conventional modelling, which does not commonly relate stormwater quality to rainfall characteristics. Additionally, it was found that the threshold intensity for pollutant wash-off from urban catchments is much lower than for rural catchments. High average intensity-short duration rainfall events are cumulatively responsible for generating a major fraction of the annual pollutant load compared to the other rainfall event types. Additionally, rainfall events of less than 1-year ARI, such as 6-month ARI, should be considered in treatment design as they generate a significant fraction of the annual runoff volume and, by implication, a significant fraction of the pollutant load. This implies that stormwater treatment designs based on larger rainfall events would not be feasible in terms of cost-effectiveness, treatment performance and possible savings in the land area needed. This also suggests that the simulation of long-term continuous rainfall events for stormwater treatment design may not be needed and that event-based simulations would be adequate. The investigations into the relationship between catchment characteristics and urban stormwater quality found that, in addition to conventional catchment characteristics such as land use and impervious area percentage, other characteristics such as urban form and pervious area location also play important roles in influencing urban stormwater quality. These outcomes point to the fact that the conventional modelling approach in the design of stormwater quality treatment systems, which is commonly based on land use and impervious area percentage alone, would be inadequate.
It was also noted that small, uniformly urbanised areas within a larger mixed catchment produce relatively lower variations in stormwater quality and, as expected, lower runoff volumes, with the opposite being the case for large, mixed-use urbanised catchments. Therefore, a decentralised approach to water quality treatment would be more effective than an "end-of-pipe" approach. The investigation of pollutant build-up on different land uses showed that pollutant build-up characteristics vary even within the same land use. Therefore, the conventional approach in stormwater quality modelling, which is based solely on land use, may prove to be inappropriate. Industrial land use had relatively higher variability in maximum pollutant build-up, build-up rate and particle size distribution than the other two land uses, whereas commercial and residential land uses had relatively higher variations in nutrient and organic carbon build-up. Additionally, it was found that particle size distribution had relatively higher variability for all three land uses compared to the other build-up parameters. The high variability in particle size distribution for all land uses illustrates the dissimilarities associated with the fine and coarse particle size fractions even within the same land use, and hence the variations in stormwater quality in relation to pollutants adsorbing to particles of different sizes.
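The three-way event typology described in this abstract lends itself to a simple rule-based classifier. The sketch below is illustrative only: the intensity and duration thresholds are assumed placeholders, not values derived in the thesis.

```python
# A minimal sketch of the three-way rainfall event typology described above.
# The numeric thresholds here are illustrative assumptions; the thesis derives
# its own boundaries from the Gold Coast catchment data.

def classify_event(avg_intensity_mm_h: float, duration_h: float,
                   intensity_threshold: float = 20.0,
                   duration_threshold: float = 2.0) -> str:
    """Assign a rainfall event to Type 1, 2 or 3."""
    high_intensity = avg_intensity_mm_h >= intensity_threshold
    long_duration = duration_h >= duration_threshold
    if high_intensity and not long_duration:
        return "Type 1: high average intensity, short duration"
    if high_intensity and long_duration:
        return "Type 2: high average intensity, long duration"
    if not high_intensity and long_duration:
        return "Type 3: low average intensity, long duration"
    return "Unclassified: low intensity, short duration"

print(classify_event(35.0, 0.5))  # -> Type 1
```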

Relevance: 90.00%

Abstract:

Purpose: The Cobb technique is the universally accepted method for measuring the severity of spinal deformities. Traditionally, Cobb angles have been measured using protractor and pencil on hardcopy radiographic films. The new generation of mobile phones makes accurate angle measurement possible using an integrated accelerometer, providing a potentially useful clinical tool for assessing Cobb angles. The purpose of this study was to compare Cobb angle measurements performed using an Apple iPhone and a traditional protractor in a series of twenty Adolescent Idiopathic Scoliosis patients. Methods: Seven observers measured major Cobb angles on twenty pre-operative postero-anterior radiographs of Adolescent Idiopathic Scoliosis patients with both a standard protractor and an Apple iPhone. Five of the observers repeated the measurements at least a week after the original measurements. Results: The mean absolute difference between pairs of iPhone/protractor measurements was 2.1°, with a small (1°) bias toward lower Cobb angles with the iPhone. 95% confidence intervals for intra-observer variability were ±3.3° for the protractor and ±3.9° for the iPhone. 95% confidence intervals for inter-observer variability were ±8.3° for the iPhone and ±7.1° for the protractor. Both of these confidence intervals are within the range of previously published Cobb measurement studies. Conclusions: We conclude that the iPhone is a Cobb measurement tool equivalent to the manual protractor, with measurement times about 15% shorter. The widespread availability of inclinometer-equipped mobile phones and the ability to store measurements in later versions of the angle measurement software may make these new technologies attractive for clinical measurement applications.
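As a rough illustration of how an accelerometer-equipped phone can serve as an inclinometer for Cobb measurement, the sketch below computes endplate tilts from an assumed in-plane gravity vector; the internals of the actual iPhone application are not described in the abstract, so everything here is an assumption for illustration.

```python
import math

# Sketch of inclinometer-style Cobb measurement, assuming access to the
# phone's gravity vector components (ax, ay) in the radiograph plane.
# The real app's implementation is not described in the abstract.

def inclination_deg(ax: float, ay: float) -> float:
    """Tilt of the phone edge relative to horizontal, in degrees."""
    return math.degrees(math.atan2(ax, ay))

def cobb_angle(upper_tilt_deg: float, lower_tilt_deg: float) -> float:
    """Cobb angle is the angle between the two most-tilted end vertebrae."""
    return abs(upper_tilt_deg - lower_tilt_deg)

upper = inclination_deg(0.26, 0.97)   # phone edge aligned with upper end plate
lower = inclination_deg(-0.34, 0.94)  # phone edge aligned with lower end plate
print(f"Cobb angle: {cobb_angle(upper, lower):.1f} deg")
```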

Relevance: 90.00%

Abstract:

The objective quantification of three-dimensional kinematics during different functional and occupational tasks is now more in demand than ever. The introduction of a new generation of low-cost passive motion capture systems from a number of manufacturers has made this technology accessible for teaching, clinical practice and small/medium industry. Despite the attractive nature of these systems, their accuracy remains unproven in independent tests. We assessed static linear accuracy and dynamic linear accuracy, and compared gait kinematics from a Vicon MX20 system with those from a Natural Point OptiTrack system. In all experiments data were sampled simultaneously. We found that both systems perform excellently in linear accuracy tests, with absolute errors not exceeding 1%. In gait data there was again strong agreement between the two systems in sagittal and coronal plane kinematics. Transverse plane kinematics differed by up to 3° at the knee and hip, which we attributed to the impact of soft tissue artifact accelerations on the data. We suggest that low-cost systems are comparably accurate to their high-end competitors and offer a platform with accuracy acceptable for research in laboratories with a limited budget.
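The two accuracy checks reported here (linear accuracy against a known length, and agreement between joint-angle traces from the two systems) can be summarised with simple statistics. A minimal sketch, using illustrative numbers rather than the study's data:

```python
import numpy as np

# Sketch of the two accuracy checks described above: absolute error of a
# measured inter-marker distance against a known length, and the RMS
# difference between joint-angle traces from two capture systems.
# All arrays below are illustrative, not the study's data.

def abs_error_percent(measured_mm: np.ndarray, true_mm: float) -> float:
    return float(np.max(np.abs(measured_mm - true_mm)) / true_mm * 100)

def rms_difference(angles_a_deg: np.ndarray, angles_b_deg: np.ndarray) -> float:
    return float(np.sqrt(np.mean((angles_a_deg - angles_b_deg) ** 2)))

rod = np.array([499.2, 500.4, 500.9, 499.6])          # measured rod length, mm
knee_vicon = np.array([5.0, 12.3, 30.1, 55.7, 60.2])  # degrees
knee_optitrack = knee_vicon + np.random.normal(0, 1.0, knee_vicon.size)

print(f"max linear error: {abs_error_percent(rod, 500.0):.2f}%")
print(f"knee-angle RMS difference: {rms_difference(knee_vicon, knee_optitrack):.2f} deg")
```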

Relevance: 90.00%

Abstract:

The next generation of service-oriented architecture (SOA) needs to scale for flexible service consumption, beyond organizational and application boundaries, into communities, ecosystems and business networks. In wider and, ultimately, global settings, new capabilities are needed so that business partners can efficiently and reliably enable, adapt and expose services. Those services can then be discovered, ordered, consumed, metered and paid for, through new applications and opportunities, driven by third parties in the global “village”. This trend is already underway, in different ways, through different early-adopter market segments. This paper proposes an architectural strategy for the provisioning and delivery of services in communities, ecosystems and business networks: a Service Delivery Framework (SDF). The SDF is intended to support multiple industries and deployments where an SOA platform is needed for collaborating partners and diverse consumers. Specifically, it is envisaged that the SDF allows providers to publish their services into network directories so that they can be repurposed, traded and consumed, while leveraging network utilities such as B2B gateways and cloud hosting. To support these different facets of service delivery, the SDF extends the conventional service provider, service broker and service consumer roles of the Web Services Architecture to include service gateway, service hoster, service aggregator and service channel maker.
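As a minimal illustration of the extended role set, the sketch below encodes the seven roles named in the paper together with a hypothetical network directory; the registry API is an assumption for illustration only, not part of the SDF specification.

```python
from dataclasses import dataclass, field
from enum import Enum

# Minimal sketch of the extended role set named above. The role names come
# from the paper; the directory/registry itself is a hypothetical illustration.

class SdfRole(Enum):
    PROVIDER = "service provider"            # Web Services Architecture
    BROKER = "service broker"                # Web Services Architecture
    CONSUMER = "service consumer"            # Web Services Architecture
    GATEWAY = "service gateway"              # SDF extension
    HOSTER = "service hoster"                # SDF extension
    AGGREGATOR = "service aggregator"        # SDF extension
    CHANNEL_MAKER = "service channel maker"  # SDF extension

@dataclass
class NetworkDirectory:
    """Directory into which providers publish services for reuse and trade."""
    entries: dict = field(default_factory=dict)

    def publish(self, service_name: str, partner: str, role: SdfRole) -> None:
        self.entries[service_name] = (partner, role)

directory = NetworkDirectory()
directory.publish("payments-v2", "acme.example", SdfRole.PROVIDER)
print(directory.entries)
```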

Relevance: 90.00%

Abstract:

The interaction between global warming and the thermal performance of buildings is dynamic in nature. In order to model and understand this behaviour, different approaches, including keeping weather variables unchanged, the morphing approach and the diurnal modelling method, have been used to project and generate future weather data. Among these approaches, various assumptions on the change of solar radiation, air humidity and/or wind characteristics may be adopted. In this paper, an example illustrating the generation of future weather data for different global warming scenarios in Australia is presented. The sensitivity of building cooling loads to possible changes in the assumed values used in the future weather data generation is investigated. It is shown that with a ±10% change in the proposed future values for solar radiation, air humidity or wind characteristics, the corresponding change in the cooling load of the modelled sample office building at different Australian capital cities would not exceed 6%, 4% and 1.5% respectively. It is also found that with ±10% changes in the proposed weather variables for both the 2070-high future scenario and the current weather scenario, the corresponding change in the cooling loads at different locations may be weaker (up to 2% difference in Hobart for a ±10% change in global solar radiation), similar (less than 0.6% difference in Hobart for a ±10% change in wind speed) or stronger (up to 1.6% difference in Hobart for a ±10% change in relative humidity) in the 2070-high future scenario than in the current weather scenario.
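The morphing approach mentioned above is commonly formulated as a shift-and-stretch of present-day hourly records. The sketch below uses that common form with illustrative parameters, not the paper's values, and adds a ±10% perturbation loop in the spirit of the sensitivity test.

```python
import numpy as np

# Sketch of the "morphing" approach in one common shift-and-stretch form:
# x' = x + dx + a * (x - mean(x)). The shift dx and stretch factor a below
# are illustrative assumptions, not the paper's scenario values.

def morph(hourly: np.ndarray, shift: float, stretch: float) -> np.ndarray:
    return hourly + shift + stretch * (hourly - hourly.mean())

dry_bulb = np.array([18.0, 17.5, 21.0, 26.5, 29.0, 24.0])  # current hourly temps, degC
future = morph(dry_bulb, shift=2.0, stretch=0.1)           # assumed warming scenario
print("morphed temperatures:", np.round(future, 1))

# Sensitivity test in the spirit of the paper: perturb a weather variable
# by +/-10% and compare the downstream response (here just the daily total).
solar = np.array([0.0, 150.0, 620.0, 840.0, 710.0, 90.0])  # W/m2, illustrative
for factor in (0.9, 1.0, 1.1):
    print(f"solar x{factor:.1f}: total over sampled hours {np.sum(solar * factor):.0f} Wh/m2")
```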

Relevance: 90.00%

Abstract:

As civil infrastructures such as bridges age, there is a concern for safety and a need for cost-effective and reliable monitoring tools. Different diagnostic techniques are available nowadays for structural health monitoring (SHM) of bridges. Acoustic emission is one such technique, with the potential to predict failure. The phenomenon of rapid release of energy within a material by crack initiation or growth, in the form of stress waves, is known as acoustic emission (AE). The AE technique involves recording the stress waves by means of sensors and subsequently analysing the recorded signals, which convey information about the nature of the source. AE can be used as a local SHM technique to monitor specific regions with a visible presence of cracks or crack-prone areas, such as welded regions and joints with bolted connections, or as a global technique to monitor the whole structure. The strength of the AE technique lies in its ability to detect active crack activity, thus helping to prioritise maintenance work by focusing on active rather than dormant cracks. Despite being a promising tool, some challenges still stand in the way of the successful application of the AE technique. One is the generation of large amounts of data during testing; hence effective data analysis and management are necessary, especially for long-term monitoring uses. Complications also arise because a number of spurious sources can give AE signals; therefore, different source discrimination strategies are necessary to distinguish genuine signals from spurious ones. Another major challenge is the quantification of the damage level by appropriate analysis of the data. Intensity analysis using severity and historic indices, as well as b-value analysis, are important methods; they are discussed and applied to laboratory experimental data in this paper.
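One of the quantification methods named above, b-value analysis, fits the slope of the amplitude-frequency distribution of AE hits. A minimal sketch, assuming the commonly used Gutenberg-Richter-style formulation log10 N = a - b*(A/20), where N is the number of hits with amplitude at least A dB; the amplitude data below are synthetic.

```python
import numpy as np

# Sketch of AE b-value analysis using the common Gutenberg-Richter-style
# relation: log10 N = a - b * (A_dB / 20), where N is the number of hits
# with amplitude >= A_dB. The amplitudes below are synthetic.

def b_value(amplitudes_db: np.ndarray, bins_db: np.ndarray) -> float:
    counts = np.array([(amplitudes_db >= a).sum() for a in bins_db])
    mask = counts > 0
    # least-squares slope of log10(N) against A/20; b is the negated slope
    slope, _ = np.polyfit(bins_db[mask] / 20.0, np.log10(counts[mask]), 1)
    return -slope

rng = np.random.default_rng(0)
hits = 40.0 + rng.exponential(scale=8.0, size=2000)  # synthetic AE amplitudes, dB
bins = np.arange(40.0, 90.0, 5.0)
print(f"b-value: {b_value(hits, bins):.2f}")
```

A falling b-value over successive monitoring windows is conventionally read as a shift from distributed microcracking toward fewer, larger fracture events.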

Relevance: 90.00%

Abstract:

This paper discusses and summarises a recent systematic study on the implications of global warming for air-conditioned office buildings in Australia. Four areas are covered: analysis of historical weather data, generation of future weather data for the impact study of global warming, projection of building performance under various global warming scenarios, and evaluation of various adaptation strategies under 2070 high global warming conditions. Overall, it is found that, depending on the assumed future climate scenario and the location considered, the increase in total building energy use for the sample Australian office building may range from 0.4% to 15.1%. When the increase in annual average outdoor temperature exceeds 2 °C, the risk of overheating increases significantly. However, the potential overheating problem could be completely eliminated if the internal load density were significantly reduced.

Relevance: 90.00%

Abstract:

This thesis examines contemporary mediated spectacles used in regional tourism strategies. In recent years there has been a growing occurrence of ‘formatted entertainment models’ in China. With this in mind, the thesis explores the ways in which traditional cultural resources are being converted to generate diverse, hybrid commodities. The unique business model of Zhang Yimou, known as the Impression Series, provides the case study. The thesis examines multilayered representations of products which continuously form, and are formatted, under the logic of the cultural market. The case study highlights the revival of traditional Chinese culture, a new branding of the Chinese national image and rising ‘soft power’. Primarily, the thesis argues that personal celebrity endorsement is replacing political propaganda heroes in promoting an alternative image of China. Zhang Yimou and Impression West Lake function as a dual branding mechanism that combines ‘people marketing’ and ‘place marketing’ for the development of a ‘created in China’ cultural commodity as well as for the generation of positive economic outcomes. Secondly, the thesis identifies how natural resources linked with a local tourism industry are articulated into cultural products and how this is experienced by visitors. Culture is a core component of China’s ‘soft power’, and ‘cultural experience’ strategies such as Impression combine global marketing and local cultural forces. The thesis argues that a creative entrepreneur has more advantages in promoting ‘soft power’ than governmental propaganda strategies. Thirdly, Impression West Lake encapsulates the rise of the creative entrepreneur with the help of local government authorities. Even though government cultural policy-makers can facilitate the cultural infrastructure, they ultimately rely on the entrepreneur’s creative vision and understanding of the market. Finally, based on the study of Impression West Lake, the thesis outlines future opportunities for social, cultural and economic reform in China.

Relevance: 90.00%

Abstract:

Web 2.0 is a new generation of online applications that permit people to collaborate and share information online. The use of such applications by employees enhances knowledge management (KM) in organisations. Employee involvement is a critical success factor, as the concept is based on openness, engagement and collaboration between people, where organisational knowledge is derived from employees’ experience, skills and best practices. Consequently, employees’ perceptions are recognised as an important factor in Web 2.0 adoption for KM and are worthy of investigation. There are few studies that define and explore employees’ acceptance of Enterprise 2.0 for KM. This paper provides a systematic review of the literature and presents the findings as a preliminary conceptual model, representing the first stage of an ongoing research project that will culminate in an empirical study. Reviewing the available technology acceptance, knowledge management and Enterprise 2.0 literature helps to identify the potential user acceptance factors of Enterprise 2.0. The preliminary conceptual model is a refinement of the theory of planned behaviour (TPB): the user acceptance factors have been mapped onto the main TPB components, namely attitude toward the behaviour, subjective norms and perceived behavioural control, which are the determinants of an individual’s intention to perform a particular behaviour.
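For illustration only, the TPB mapping can be caricatured as a weighted score over its three components; the weights, scales and function names below are hypothetical, and the paper's model is conceptual rather than quantitative at this stage.

```python
from dataclasses import dataclass

# Illustrative sketch of the TPB mapping described above: behavioural
# intention modelled as a weighted combination of attitude, subjective norm
# and perceived behavioural control. Weights and scales are hypothetical.

@dataclass
class TpbResponse:
    attitude: float            # e.g. 1-7 Likert scale
    subjective_norm: float     # e.g. 1-7 Likert scale
    perceived_control: float   # e.g. 1-7 Likert scale

def intention_score(r: TpbResponse, w_att: float = 0.4,
                    w_sn: float = 0.3, w_pbc: float = 0.3) -> float:
    return w_att * r.attitude + w_sn * r.subjective_norm + w_pbc * r.perceived_control

print(intention_score(TpbResponse(attitude=6, subjective_norm=5, perceived_control=4)))
```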

Relevance: 90.00%

Abstract:

The purpose of this article is to describe a project with one Torres Strait Islander Community. It provides some insights into parents’ funds of knowledge that are mathematical in nature, such as sorting shells and giving fish. The idea of funds of knowledge is based on the premise that people are competent and have knowledge that has been historically and culturally accumulated into a body of knowledge and skills essential for their functioning and well-being. This knowledge is then practised throughout their lives and passed on to the next generation of children. Through a community research approach, funds of knowledge that validate the community’s identities as knowledgeable people can also be used as foundations for future learning for teachers, parents and children in the early years of school. They can be the bridge that joins a community’s funds of knowledge with schools that validate that knowledge.

Relevance: 90.00%

Abstract:

Mesenchymal stem cells (MSCs) are undifferentiated, multipotent stem cells with the ability to self-renew. They can differentiate into many types of terminal cells, such as osteoblasts, chondrocytes, adipocytes, myocytes and neurons. These cells have been applied in tissue engineering as the main cell type for regenerating new tissues. However, a number of issues remain concerning the use of MSCs, such as cell surface markers, the determining factors responsible for their differentiation into terminal cells, and the mechanisms whereby growth factors stimulate MSCs. In this chapter, we discuss how proteomic techniques have contributed to our current knowledge and how they can be used to address issues currently facing MSC research. The application of proteomics has led to the identification of a specific pattern of cell surface protein expression in MSCs. The technique has also contributed to the study of the regulatory network governing MSC differentiation into terminally differentiated cells, including osteocytes, chondrocytes, adipocytes, neurons, cardiomyocytes, hepatocytes and pancreatic islet cells. It has also helped elucidate mechanisms of growth factor-stimulated differentiation of MSCs. Proteomics cannot, however, reveal the precise role of a specific pathway and must therefore be combined with other approaches for this purpose. A new generation of proteomic techniques has recently been developed, which will enable a more comprehensive study of MSCs.

Relevance: 90.00%

Abstract:

The effects of tumour motion during radiation therapy delivery have been widely investigated. Motion effects have become increasingly important with the introduction of dynamic radiotherapy delivery modalities such as enhanced dynamic wedges (EDWs) and intensity modulated radiation therapy (IMRT), where a dynamically collimated radiation beam is delivered to the moving target, resulting in dose blurring and interplay effects which are a consequence of the combined tumour and beam motion. Prior to this work, reported studies on EDW-based interplay effects had been restricted to experimental methods for assessing single-field, non-fractionated treatments. In this work, the interplay effects have been investigated for EDW treatments. Single and multiple field treatments have been studied using experimental and Monte Carlo (MC) methods. Initially this work experimentally studies interplay effects for single-field non-fractionated EDW treatments, using radiation dosimetry systems placed on a sinusoidally moving platform. A number of wedge angles (60°, 45° and 15°), field sizes (20 × 20, 10 × 10 and 5 × 5 cm2), amplitudes (10-40 mm in steps of 10 mm) and periods (2 s, 3 s, 4.5 s and 6 s) of tumour motion are analysed (using gamma analysis) for parallel and perpendicular motions (where the tumour and jaw motions are either parallel or perpendicular to each other). For parallel motion it was found that both the amplitude and period of tumour motion affect the interplay; this becomes more prominent where the collimator and tumour speeds become identical. For perpendicular motion the amplitude of tumour motion is the dominant factor, whereas varying the period of tumour motion has no observable effect on the dose distribution. The wedge angle results suggest that the use of a large wedge angle generates greater dose variation for both parallel and perpendicular motions. The use of a small field size with a large tumour motion results in the loss of the wedged dose distribution for both parallel and perpendicular motion. From these single-field measurements, a motion amplitude and period were identified which show the poorest agreement between the target motion and dynamic delivery; these are used as the "worst case" motion parameters. The experimental work is then extended to multiple-field fractionated treatments. Here a number of pre-existing, multiple-field, wedged lung plans are delivered to the radiation dosimetry systems, employing the worst case motion parameters. Moreover, a four-field EDW lung plan (using a 4D CT data set) is delivered to the IMRT quality control phantom with a dummy tumour insert over four fractions using the worst case parameters, i.e. 40 mm amplitude and 6 s period. The analysis of the film doses using gamma analysis at 3%-3mm indicates that the interplay effects did not average out in this particular study, with a gamma pass rate of 49%. To enable Monte Carlo modelling of the problem, the DYNJAWS component module (CM) of the BEAMnrc user code is validated and automated. DYNJAWS, recently introduced to model dynamic wedges, is commissioned for 6 MV and 10 MV photon energies. It is shown that this CM can accurately model the EDWs for a number of wedge angles and field sizes. The dynamic and step-and-shoot modes of the CM are compared for their accuracy in modelling the EDW, and it is shown that the dynamic mode is more accurate. An automation of the DYNJAWS-specific input file has been carried out.
This file specifies the probability of selection of a subfield and the respective jaw coordinates. This automation simplifies the generation of the BEAMnrc input files for DYNJAWS. The commissioned DYNJAWS model is then used to study multiple-field EDW treatments using MC methods. The 4D CT data of an IMRT phantom with the dummy tumour is used to produce a set of Monte Carlo simulation phantoms, on which the delivery of single-field and multiple-field EDW treatments is simulated. A number of static and motion multiple-field EDW plans have been simulated. The comparison of dose volume histograms (DVHs) and gamma volume histograms (GVHs) for four-field EDW treatments (where the collimator and patient motion is in the same direction) using small (15°) and large (60°) wedge angles indicates a greater mismatch between the static and motion cases for the large wedge angle. Finally, to use gel dosimetry as a validation tool, a new technique called the "zero-scan" method is developed for reading the gel dosimeters with x-ray computed tomography (CT). It has been shown that multiple scans of a gel dosimeter (in this case 360 scans) can be used to reconstruct a zero-scan image. This zero-scan image has a precision similar to that of an image obtained by averaging the CT images, without the additional dose delivered by the CT scans. In this investigation the interplay effects have been studied for single and multiple field fractionated EDW treatments using experimental and Monte Carlo methods. For the Monte Carlo methods, the DYNJAWS component module of the BEAMnrc code has been validated, automated and further used to study the interplay for multiple-field EDW treatments. The zero-scan method, a new gel dosimetry readout technique, has been developed for reading gel images using x-ray CT without losing precision or accuracy.
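The zero-scan reconstruction can be illustrated as a per-voxel linear fit against scan number, keeping the intercept as the extrapolated "scan zero" image. A minimal sketch on synthetic data, assuming an approximately linear dose-related drift; the thesis's actual fitting procedure may differ.

```python
import numpy as np

# Sketch of the zero-scan idea: fit each voxel's CT number as a linear
# function of scan index and keep the intercept, i.e. the extrapolated value
# at "scan zero", before the CT scans themselves deposited dose in the gel.
# Assumes an approximately linear drift; the image stack below is synthetic.

def zero_scan(stack: np.ndarray) -> np.ndarray:
    """stack: (n_scans, rows, cols) array of repeated CT images."""
    n, rows, cols = stack.shape
    scan_idx = np.arange(1, n + 1, dtype=float)
    flat = stack.reshape(n, -1)
    # per-pixel first-order fit; row 1 of the coefficients is the intercept
    coeffs = np.polyfit(scan_idx, flat, deg=1)
    return coeffs[1].reshape(rows, cols)

rng = np.random.default_rng(1)
true_image = rng.normal(50.0, 5.0, size=(64, 64))
drift = 0.02 * np.arange(1, 361)[:, None, None]   # slow dose-related drift
noise = rng.normal(0.0, 3.0, size=(360, 64, 64))  # per-scan CT noise
scans = true_image[None, :, :] + drift + noise

recovered = zero_scan(scans)
print(f"mean abs error vs true image: {np.abs(recovered - true_image).mean():.3f}")
```

Averaging the 360 scans would suppress the noise equally well but would retain the drift; extrapolating to scan zero removes the contribution of the scanning itself, which is the point of the method.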

Relevance: 90.00%

Abstract:

Programmed cell death (PCD) and progenitor cell generation (of glial and, in some brain areas, also neuronal fate) in the CNS are active processes throughout life, and PCD is generally not associated with gliosis, which means it can be pathologically silent. The striking discovery that progenitor cell generation is widespread in the adult CNS (especially the hippocampus) suggests a much more dynamic scenario than previously thought and transcends the dichotomy between neurodevelopmental and neurodegenerative models of schizophrenia and related disorders. We suggest that the processes that regulate PCD and the generation of progenitor cells may be disturbed in the early phase of psychotic disorders, underpinning a disconnectivity syndrome at the onset of clinically overt disorders. An ongoing 1H-MRS study of the anterior hippocampus at 3 Tesla in mostly drug-naive first-episode psychosis patients suggests no change in NAA, but significant increases in myo-inositol and lactate. The data suggest that neuronal integrity in the anterior hippocampus is still intact at this early stage of illness, or only functionally impaired. However, the increases in lactate and myo-inositol may reflect a disturbance of the generation and PCD of progenitor cells (of glial and, in selected brain areas, also neuronal fate) at the onset of psychosis. If true, the use of neuroprotective agents such as lithium or eicosapentaenoic acid (which inhibit PCD and support cell generation) in the early phase of psychotic disorders may be a potent treatment avenue to explore.

Relevance: 90.00%

Abstract:

This work focuses on the development of a stand-alone gas nanosensor node, powered by solar energy, to track the concentration of polluting gases such as NO2, N2O and NH3. Gas sensor networks have been widely developed over recent years, but the rise of nanotechnology is allowing the creation of a new range of gas sensors [1] with higher performance, smaller size and an inexpensive manufacturing process. This work has created a gas nanosensor node prototype to evaluate the future field performance of this new generation of sensors. The sensor node has four main parts: (i) solar cells; (ii) control electronics; (iii) gas sensor and sensor board interface [2-4]; and (iv) data transmission. The station is remotely monitored through a wired (Ethernet cable) or wireless (radio transmitter) connection [5, 6] in order to evaluate, in real time, the performance of the solar cells and sensor node under different weather conditions. The energy source of the node is a module of polycrystalline silicon solar cells with 410 cm2 of active surface. The prototype is equipped with a Resistance-to-Period circuit [2-4] to measure the wide range of resistances (kΩ to GΩ) from the sensor in a simple and accurate way. The system shows high performance in (i) managing the energy from the solar panel, (ii) powering the system load and (iii) recharging the battery. The results show that the prototype is suitable for use with any kind of resistive gas nanosensor and provides useful data for future nanosensor networks.
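The Resistance-to-Period interface itself is detailed in the cited references [2-4]; as a generic illustration of the idea, the sketch below assumes a simple RC relaxation oscillator whose period scales linearly with the sensor resistance, so inverting the measured period recovers resistances across the kΩ to GΩ range. The oscillator constant and reference capacitance are assumed values, not the prototype's.

```python
# Sketch of a resistance-to-period readout, assuming a simple RC relaxation
# oscillator where the period scales as T = K * R * C_REF. The constant K
# and the reference capacitance are illustrative assumptions; the actual
# circuit is the cited Resistance-to-Period interface [2-4].

K = 1.1          # oscillator constant (assumed, 555-style astable)
C_REF = 100e-12  # reference capacitor, farads (assumed)

def resistance_from_period(period_s: float) -> float:
    """Invert T = K * R * C_REF to recover the sensor resistance."""
    return period_s / (K * C_REF)

# A kilohm-to-gigohm sensor maps onto a sub-microsecond-to-subsecond period
# range, which a plain microcontroller timer can measure accurately.
for period in (1.1e-7, 1.1e-4, 1.1e-2, 1.1e-1):
    print(f"T = {period:.1e} s  ->  R = {resistance_from_period(period):.2e} ohm")
```

Measuring a period rather than a voltage is what lets one circuit span six orders of magnitude of sensor resistance without range switching.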