933 results for Riemann-Liouville Derivative


Relevance:

10.00%

Publisher:

Abstract:

Homologous recombinational repair is an essential mechanism for repair of double-strand breaks in DNA. Recombinases of the RecA-fold family play a crucial role in this process, forming filaments that utilize ATP to mediate their interactions with single- and double-stranded DNA. The recombinase molecules present in the archaea (RadA) and eukaryota (Rad51) are more closely related to each other than to their bacterial counterpart (RecA) and, as a result, RadA makes a suitable model for the eukaryotic system. The crystal structure of Sulfolobus solfataricus RadA has been solved to a resolution of 3.2 Å in the absence of nucleotide analogues or DNA, revealing a narrow filamentous assembly with three molecules per helical turn. As observed in other RecA-family recombinases, each RadA molecule in the filament is linked to its neighbour via interactions of a short β-strand with the neighbouring ATPase domain. However, despite apparent flexibility between domains, comparison with other structures indicates conservation of a number of key interactions that introduce rigidity to the system, allowing allosteric control of the filament by interaction with ATP. Additional analysis reveals that the interaction specificity of the five human Rad51 paralogues can be predicted using a simple model based on the RadA structure.

Relevance:

10.00%

Publisher:

Abstract:

The title compound, C18H12N6O6, was prepared from the reaction of 4-(phenyldiazenyl)aniline (aniline yellow) with picrylsulfonic acid. The dihedral angle formed by the two benzene rings of the diphenyldiazenyl ring system is 6.55(13)°, and that formed by the rings of the picrate-aniline ring system is 48.76(12)°. The molecule contains an intramolecular aniline-nitro N-H...O hydrogen bond.
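As background to the reported geometry: the dihedral angle between two ring systems is the angle between their mean (best-fit) planes. A minimal Python sketch of that calculation is given below. The best-fit normals come from an SVD of the centred ring coordinates; the two hexagons (one tilted by 6.55°) are illustrative stand-ins, not the actual crystallographic coordinates of the title compound.

```python
import numpy as np

def plane_normal(coords: np.ndarray) -> np.ndarray:
    """Unit normal of the best-fit plane through a set of 3-D points
    (last right-singular vector of the centred coordinates)."""
    centred = coords - coords.mean(axis=0)
    _, _, vt = np.linalg.svd(centred)  # rows of vt are principal directions
    return vt[-1]

def dihedral_angle(ring_a: np.ndarray, ring_b: np.ndarray) -> float:
    """Angle in degrees between the mean planes of two rings."""
    n1, n2 = plane_normal(ring_a), plane_normal(ring_b)
    cos_angle = abs(np.dot(n1, n2))  # abs(): planes, not directed normals
    return float(np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0))))

# Illustrative only: two regular hexagons, the second tilted by 6.55 deg
# about the x-axis; NOT the compound's real atomic coordinates.
theta = np.radians(np.arange(0, 360, 60))
ring_a = np.column_stack([np.cos(theta), np.sin(theta), np.zeros(6)])
tilt = np.radians(6.55)
rot_x = np.array([[1, 0, 0],
                  [0, np.cos(tilt), -np.sin(tilt)],
                  [0, np.sin(tilt),  np.cos(tilt)]])
ring_b = ring_a @ rot_x
print(f"dihedral angle: {dihedral_angle(ring_a, ring_b):.2f} deg")  # ~6.55
```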

Relevance:

10.00%

Publisher:

Abstract:

Building insulation is often used to reduce conduction heat transfer through the building envelope: the higher the level of insulation (i.e., the greater the R-value), the less heat is conducted through the envelope. In this paper, using building computer simulation techniques, the effects of building insulation levels on the thermal and energy performance of a sample air-conditioned office building in Australia are studied. It is found that, depending on the type of building and the climate in which it is located, increasing the level of building insulation will not always bring benefits in energy saving and thermal comfort, particularly for internal-load dominated office buildings located in temperate/tropical climates. The possible implications of building insulation in the face of global warming have also been examined. Compared with the influence of insulation on building thermal performance, the influence on building energy use is relatively small.
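For readers unfamiliar with R-values: steady-state conduction through an envelope element follows Q = A·ΔT/R, so heat flow falls off as the reciprocal of the R-value. The sketch below illustrates the resulting diminishing returns; the wall area, temperature difference and R-values are illustrative assumptions, not figures from the study.

```python
# Steady-state conduction through an envelope element: Q = A * dT / R.
# All numbers below are illustrative assumptions, not data from the paper.

def conduction_heat_rate(area_m2: float, delta_t_k: float, r_value: float) -> float:
    """Conduction heat flow in watts through a surface of thermal
    resistance r_value (m^2.K/W)."""
    return area_m2 * delta_t_k / r_value

area, delta_t = 100.0, 10.0  # 100 m^2 wall, 10 K indoor-outdoor difference
for r in (1.0, 2.0, 3.0, 4.0):
    q = conduction_heat_rate(area, delta_t, r)
    print(f"R = {r:.1f} m^2.K/W -> Q = {q:7.1f} W")
# Raising R from 1 to 2 saves 500 W; raising it from 3 to 4 saves only ~83 W.
```

Note that this simple resistance model captures only envelope conduction; the study's finding that extra insulation can hurt internal-load dominated buildings plausibly arises because retained internal heat gains must then be removed by the cooling system.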

Relevance:

10.00%

Publisher:

Abstract:

Sustainable practices are more than ever on the radar of organizations, triggered by growing demand from the wider population for approaches and practices that can be considered "green" or "sustainable". Our specific intent with this call for action is to immerse ourselves more deeply in the role of business processes, and specifically the contribution that the management of these processes can make in leveraging the transformative power of information systems (IS) to create environmentally sustainable organizations. Our key premise is that business and information technology (IT) managers need to engage in a process-focused discussion to enable a common, comprehensive understanding of processes, and of the process-centered opportunities for making these processes, and ultimately the organization as a process-centric entity, "green". Based on a business process lifecycle model, we propose possible avenues for future research.

Relevance:

10.00%

Publisher:

Abstract:

One of the prominent topics in Business Service Management is business models for (new) services. Business models are useful for service management and engineering as they provide a broader and more holistic perspective on services. They are particularly relevant for service innovation, which requires attention to the business models that make new services viable; business model innovation can in turn drive the innovation of both new and established services. Before we can look at business models for services, we first need to understand what business models are. This is not straightforward, as business models are still not well understood and knowledge about them is fragmented across different disciplines, such as information systems, strategy, innovation, and entrepreneurship. This whitepaper, ‘Understanding business models’, introduces readers to business models. It contributes to enhancing the understanding of business models, in particular their conceptualisation, by discussing and integrating business model definitions, frameworks and archetypes from different disciplines. After reading this whitepaper, the reader will have a well-developed understanding of what business models are and how the concept is sometimes interpreted and used in different ways. It will help the reader in assessing their own understanding of business models and that of others. This will contribute to a better and more beneficial use of business models, an increase in shared understanding, and making it easier to work with business model techniques and tools.

Relevance:

10.00%

Publisher:

Abstract:

The tear film plays an important role in preserving the health of the ocular surface and maintaining the optimal refractive power of the cornea. Moreover, dry eye syndrome, one of the most commonly reported eye health problems, is caused by abnormalities in the properties of the tear film. Current clinical tools to assess the tear film have shown certain limitations. The traditional invasive methods for assessing tear film quality, which are used by most clinicians, have been criticized for their lack of reliability and/or repeatability. A range of non-invasive methods of tear assessment have been investigated, but these also present limitations. Hence, no "gold standard" test is currently available to assess tear film integrity. Improving techniques for the assessment of tear film quality is therefore of clinical significance and the main motivation for the work described in this thesis.

In this study, tear film surface quality (TFSQ) changes were investigated by means of high-speed videokeratoscopy (HSV). In this technique, a set of concentric rings formed in an illuminated cone or bowl is projected onto the anterior cornea, and their reflection from the ocular surface is imaged on a charge-coupled device (CCD). The light is reflected at the outermost layer of the cornea, the tear film. Hence, when the tear film is smooth, the reflected image presents a well-structured pattern; when the tear film surface presents irregularities, the pattern also becomes irregular due to scatter and deviation of the reflected light. The videokeratoscope provides an estimate of the corneal topography associated with each Placido disk image. Topographical estimates, which have been used in the past to quantify tear film changes, may not always be suitable for evaluating all the dynamic phases of the tear film; the Placido disk image itself, which contains the reflected pattern, may be more appropriate for assessing tear film dynamics.

A set of novel routines was purposely developed to quantify changes in the reflected pattern and to extract a time series estimate of TFSQ from the video recording. The routine extracts from each frame a maximized area of analysis, within which a TFSQ metric is calculated. Initially, two metrics based on Gabor-filter and Gaussian gradient-based techniques were used to quantify the consistency of the pattern's local orientation as a measure of TFSQ. These metrics helped to demonstrate the applicability of HSV to tear film assessment and the influence of contact lens wear on TFSQ. The results suggest that the dynamic-area analysis method of HSV was able to distinguish and quantify the subtle but systematic degradation of tear film surface quality in the inter-blink interval during contact lens wear, and to clearly show a difference between bare-eye and contact lens wearing conditions. Thus, HSV appears to be a useful technique for quantitatively investigating the effects of contact lens wear on TFSQ.

Subsequently, a larger clinical study was conducted to compare HSV with two other non-invasive techniques: lateral shearing interferometry (LSI) and dynamic wavefront sensing (DWS). Of these, HSV appeared to be the most precise method for measuring TFSQ, by virtue of its lower coefficient of variation, while LSI appeared to be the most sensitive method for analyzing tear break-up time (TBUT). The capability of each non-invasive method to discriminate dry eye from normal subjects was also investigated, with receiver operating characteristic (ROC) curves calculated to assess each method's ability to predict dry eye syndrome. The LSI technique gave the best results under both natural and suppressed blinking conditions, closely followed by HSV; DWS did not perform as well as LSI or HSV.

The main limitation of the HSV technique identified during this clinical study was a lack of sensitivity in quantifying the build-up/formation phase of the tear film cycle. For that reason, an additional metric based on image transformation and block processing was proposed. In this metric, the area of analysis is transformed from Cartesian to polar coordinates, converting the concentric-circle pattern into a quasi-straight-line image from which a block statistics value is extracted. This metric has shown better sensitivity under low pattern disturbance and has improved the performance of the ROC curves. A sketch of this transform-and-block-statistics idea is given after this abstract.

Additionally, a theoretical study based on ray-tracing techniques and topographical models of the tear film was carried out to fully comprehend the HSV measurement and the instrument's potential limitations. Of special interest was the assessment of the instrument's sensitivity to subtle topographic changes. The theoretical simulations have helped to provide some understanding of tear film dynamics; for instance, the model extracted for the build-up phase has provided some insight into the dynamics of this initial phase.

Finally, some aspects of the mathematical modeling of TFSQ time series are reported in this thesis. Over the years, different functions have been used to model such time series and to extract the key clinical parameters (i.e., timing). Unfortunately, existing techniques for modeling tear film time series do not simultaneously consider the underlying physiological mechanism and the parameter extraction methods. A set of guidelines is proposed to meet both criteria. Special attention was given to a commonly used fit, the polynomial function, and to considerations for selecting the appropriate model order to ensure the true derivative of the signal is accurately represented.

The work described in this thesis has shown the potential of using high-speed videokeratoscopy to assess tear film surface quality. A set of novel image and signal processing techniques has been proposed to quantify different aspects of tear film assessment, analysis and modeling. The dynamic-area HSV method has shown good performance in a broad range of conditions (i.e., contact lens, normal and dry eye subjects). As a result, this technique could become a useful clinical tool for assessing tear film surface quality.
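Below is a minimal Python sketch of the Cartesian-to-polar transform and block-processing idea described above. The abstract does not specify the exact block statistic, so the per-block standard deviation used here, along with the function names and parameters, are illustrative assumptions rather than the thesis's actual implementation.

```python
import numpy as np
from scipy import ndimage

def to_polar(image: np.ndarray, centre: tuple,
             n_r: int = 128, n_theta: int = 360) -> np.ndarray:
    """Resample a Placido-disk image from Cartesian to polar coordinates,
    so that concentric rings map to quasi-straight (horizontal) lines."""
    cy, cx = centre
    r_max = min(cy, cx, image.shape[0] - cy, image.shape[1] - cx)
    r = np.linspace(0, r_max - 1, n_r)
    theta = np.linspace(0, 2 * np.pi, n_theta, endpoint=False)
    rr, tt = np.meshgrid(r, theta, indexing="ij")
    # Sample the image along rays from the centre (bilinear interpolation).
    coords = np.array([cy + rr * np.sin(tt), cx + rr * np.cos(tt)])
    return ndimage.map_coordinates(image, coords, order=1)

def block_metric(polar: np.ndarray, block: int = 16) -> float:
    """Hypothetical block statistic: mean per-block standard deviation of the
    polar image, tracked over time as a texture measure. Deviations from the
    smooth-film baseline then flag tear film surface irregularity."""
    h = (polar.shape[0] // block) * block
    w = (polar.shape[1] // block) * block
    blocks = polar[:h, :w].reshape(h // block, block, w // block, block)
    return float(blocks.std(axis=(1, 3)).mean())

# Per-frame TFSQ time series over a video (frames: iterable of 2-D arrays):
# tfsq = [block_metric(to_polar(f, (f.shape[0] // 2, f.shape[1] // 2)))
#         for f in frames]
```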

Relevance:

10.00%

Publisher:

Abstract:

Human hair fibres are ubiquitous in nature and are frequently found at crime scenes, often as a result of exchange between the perpetrator, victim and/or the surroundings in accordance with Locard's Principle. Hair fibre evidence can therefore provide important information for crime investigation. For human hair evidence, current forensic methods of analysis rely on comparisons of either hair morphology by microscopic examination or nuclear and mitochondrial DNA analyses. Unfortunately, in some instances the use of microscopy and DNA analyses is difficult and often not feasible. This dissertation is arguably the first comprehensive investigation aimed at comparing, classifying and identifying single human scalp hair fibres with the aid of FTIR-ATR spectroscopy in a forensic context.

Spectra were collected from the hair of 66 subjects of Asian, Caucasian and African (i.e., African-type) origin. The fibres ranged from untreated to variously mildly and heavily cosmetically treated hairs. The collected spectra reflect the physical and chemical nature of the hair near the surface, particularly the cuticle layer. In total, 550 spectra were acquired and processed to construct a relatively large database. To assist with the interpretation of the complex spectra from various types of human hair, derivative spectroscopy and chemometric methods were utilised: Principal Component Analysis (PCA), Fuzzy Clustering (FC), and the Multi-Criteria Decision Making (MCDM) methods Preference Ranking Organisation Method for Enrichment Evaluation (PROMETHEE) and Geometrical Analysis for Interactive Aid (GAIA). FTIR-ATR spectroscopy had two important advantages over previous methods: (i) sample throughput and spectral collection were significantly improved (no physical flattening or microscope manipulations); and (ii) given the recent advances in FTIR-ATR instrument portability, there is real potential to transfer this work's findings seamlessly to in-field applications.

The "raw" spectra, spectral subtractions and second-derivative spectra were compared to demonstrate the subtle differences between human hairs (a sketch of this derivative-plus-PCA workflow follows this abstract). SEM images were used as corroborative evidence of the surface topography of the hair, indicating that the condition of the cuticle surface falls into three types: untreated, mildly treated and chemically treated. Extensive studies of the spectral regions potentially responsible for matching and discriminating the various hair samples suggested that the 1690-1500 cm-1 region is preferable to the commonly used 1750-800 cm-1 region. The principal reason is the presence of highly variable spectral profiles of cystine oxidation products (1200-1000 cm-1), which contribute significantly to spectral scatter and hence to poor hair sample matching. In the preferred 1690-1500 cm-1 region, conformational changes in the keratin protein, attributed to α-helical to β-sheet transitions in the Amide I and Amide II vibrations, played a significant role in the matching and discrimination of the spectra and hence of the hair fibre samples.

For gender comparison, the Amide II band is significant for differentiation. The results illustrate that male hair spectra exhibit a more intense β-sheet vibration in the Amide II band at approximately 1511 cm-1, whilst female hair spectra display a more intense α-helical vibration at 1520-1515 cm-1. In terms of chemical composition, female hair spectra exhibit greater intensities for the amino acids tryptophan (1554 cm-1) and aspartic and glutamic acid (1577 cm-1). It was also observed that, for the separation of samples based on racial differences, untreated Caucasian hair was discriminated from Asian hair as a result of higher levels of the amino acids cystine and cysteic acid. However, when mildly or chemically treated, Asian and Caucasian hair fibres are similar, whereas African-type hair fibres remain distinct.

In terms of the investigation's novel contribution to the field of forensic science, it has allowed the development of a novel, multifaceted, methodical protocol where previously none existed. The protocol is a systematic method to rapidly investigate unknown or questioned single-human-hair FTIR-ATR spectra across genders, racial origins and cosmetic treatments. Unknown or questioned spectra are first separated on the basis of chemical treatment (untreated, mildly treated or chemically treated), then gender, and then racial origin (Asian, Caucasian or African-type). The methodology has the potential to complement the current forensic methods for analysing fibre evidence (i.e., microscopy and DNA), providing information at the morphological, genetic and structural levels.
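A minimal sketch of the derivative-spectroscopy-plus-PCA step follows, using an arbitrary placeholder matrix in place of the real 550-spectrum database. The Savitzky-Golay window, polynomial order and use of scikit-learn's PCA are illustrative choices, not the dissertation's actual settings.

```python
import numpy as np
from scipy.signal import savgol_filter
from sklearn.decomposition import PCA

# Placeholder data standing in for real FTIR-ATR absorbance spectra:
# one row per hair fibre, one column per wavenumber point.
rng = np.random.default_rng(0)
wavenumbers = np.linspace(1800, 800, 1000)          # cm^-1, descending
spectra = rng.normal(size=(66, wavenumbers.size)).cumsum(axis=1)

# Restrict to the 1690-1500 cm^-1 window preferred in the study.
window = (wavenumbers <= 1690) & (wavenumbers >= 1500)
x = spectra[:, window]

# Second-derivative (Savitzky-Golay) spectra help resolve the overlapping
# Amide I / Amide II band structure before chemometric analysis.
d2 = savgol_filter(x, window_length=15, polyorder=3, deriv=2, axis=1)

# PCA on the derivative spectra; score plots would then be inspected for
# clustering by treatment, gender or racial origin.
scores = PCA(n_components=3).fit_transform(d2)
print(scores.shape)  # (66, 3)
```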

Relevance:

10.00%

Publisher:

Abstract:

A novel antioxidant for the potential treatment of ischaemia was designed by incorporating an isoindoline nitroxide into the framework of the free radical scavenger edaravone. 5-(3-Methyl-pyrazol-5-ol-1-yl)-1,1,3,3-tetramethylisoindolin-2-yloxyl 7 was prepared by N-arylation of 3-methyl-5-pyrazolone with 5-iodo-1,1,3,3-tetramethylisoindoline-2-yloxyl 8 in the presence of catalytic copper(I) iodide. Evaluation of 7, its methoxyamine derivative 10 and 5-carboxy-1,1,3,3-tetramethylisoindolin-2-yloxyl (CTMIO) against edaravone 1 in ischaemic rat atrial cardiomyocytes revealed significant decreases in cell death after prolonged ischaemia for each agent; however, the protective effect of the novel antioxidant 7 (greater than 85% reduction in cell death at 100 μM) was significantly enhanced over that of edaravone 1 alone. Furthermore, the activity of 7 was found to be equal to or greater than that of the potent cardioprotective agent N6-cyclopentyladenosine (CPA). The methoxyamine adduct 10 and edaravone 1 showed no difference in the extent of reduction in cell death, whilst CTMIO had only a modest protective effect.

Relevance:

10.00%

Publisher:

Abstract:

This article describes an exercise in collective narrative practice, built around the metaphor of adventure. This metaphor helped to scaffold the development of stories of personal agency for a group of Australian primary school children whose teachers were afraid they might be traumatised by events which occurred during a school excursion. During the excursion, the group of 110 Year 5 and 6 school children had their accommodation broken into on two separate occasions and various belongings stolen. The very brief period made available for ‘debriefing’ was used to introduce the metaphor of adventure, and open up space for the children to begin constructing a story in which they were ‘powerful’, as an alternative to the story of powerlessness and victimhood in which they were initially caught up.

Relevance:

10.00%

Publisher:

Abstract:

A forced landing is an unscheduled event in flight requiring an emergency landing, and is most commonly attributed to engine failure, failure of avionics or adverse weather. Since the ability to conduct a successful forced landing is the primary indicator of safety in the aviation industry, automating this capability for unmanned aerial vehicles (UAVs) will help facilitate their integration into, and subsequent routine operations over, civilian airspace. Currently, no commercial system is available to perform this task; however, a team at the Australian Research Centre for Aerospace Automation (ARCAA) is working towards developing such an automated forced landing system. This system, codenamed Flight Guardian, will operate onboard the aircraft and use machine vision for site identification; artificial intelligence for data assessment and evaluation; and path planning, guidance and control techniques to actualize the landing. This thesis focuses on research specific to the third category, and presents the design, testing and evaluation of a Trajectory Generation and Guidance System (TGGS) that navigates the aircraft to land at a chosen site following an engine failure.

Firstly, two algorithms are developed that adapt manned-aircraft forced landing techniques to the UAV planning problem. Algorithm 1 allows the UAV to select a route (from a library) based on a fixed glide range and the ambient wind conditions, while Algorithm 2 uses a series of adjustable waypoints to cater for changing winds. A comparison of both algorithms over 200 simulated forced landings found that with Algorithm 2, twice as many landings fell within the designated area, with an average lateral miss distance of 200 m at the aimpoint. These results form a baseline for further refinements to the planning algorithms.

A significant contribution is the design of the 3-D Dubins Curves planning algorithm, which extends the elementary concepts underlying 2-D Dubins paths to account for powerless flight in three dimensions. This has also led to new methods for testing path traversability, for losing excess altitude, and for the actual path formation to ensure aircraft stability. Simulations using this algorithm have demonstrated lateral and vertical miss distances of under 20 m at the approach point in wind speeds of up to 9 m/s, more than a tenfold improvement over Algorithm 2 that emulates the performance of manned, powered aircraft.

The lateral guidance algorithm originally developed by Park, Deyst, and How (2007) is enhanced to include wind information in the guidance logic (the core of this guidance law is sketched after this abstract). A simple assumption is also made that reduces the complexity of the algorithm when following a circular path, without sacrificing performance, and a specific method of supplying the correct turning direction is used. Simulations have shown that this new algorithm, named the Enhanced Nonlinear Guidance (ENG) algorithm, performs much better in changing winds, with cross-track errors at the approach point within 2 m, compared with over 10 m using Park's algorithm.

A fourth contribution is the design of the Flight Path Following Guidance (FPFG) algorithm, which uses path angle calculations and MacCready theory to determine the optimal speed to fly in wind. This algorithm also uses proportional-integral-derivative (PID) gain schedules to finely tune the tracking accuracies, and has demonstrated vertical miss distances of under 2 m in simulations with changing winds.

A fifth contribution is the design of the Modified Proportional Navigation (MPN) algorithm, which uses principles from proportional navigation and the ENG algorithm, as well as methods of its own, to calculate the required pitch to fly. This algorithm is robust to wind changes and is easily adaptable to any aircraft type; tracking accuracies obtained with it are comparable to those obtained using the FPFG algorithm. For all three preceding guidance algorithms, a novel method utilising the geometric and time relationship between aircraft and path is employed to ensure that the aircraft can still track the desired path to completion in strong winds while remaining stabilised.

Finally, a derived contribution modifies the 3-D Dubins Curves algorithm to suit helicopter flight dynamics. This modification allows a helicopter to autonomously track both stationary and moving targets in flight, and is highly advantageous for applications such as traffic surveillance, police pursuit, security or payload delivery.

Each of these achievements serves to enhance the onboard autonomy and safety of a UAV, which in turn will help facilitate the integration of UAVs into civilian airspace for a wider appreciation of the good that they can provide. The automated UAV forced landing planning and guidance strategies presented in this thesis will allow this technology to progress from the design and development stages to a prototype system that can demonstrate its effectiveness to the UAV research and operations community.
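As a point of reference for the guidance work above, the sketch below implements the core lateral guidance law of Park, Deyst, and How (2007), on which the ENG algorithm builds: a commanded lateral acceleration a = 2V²/L1 · sin(η), where η is the angle from the velocity vector to the line of sight to a reference point a distance L1 ahead on the path. Using the ground-velocity vector here is a crude stand-in for wind handling; the ENG algorithm's actual wind compensation and turn-direction logic are not reproduced.

```python
import numpy as np

def lateral_accel_cmd(position, ground_vel, ref_point):
    """Core of the Park, Deyst and How (2007) nonlinear lateral guidance law:
    a_cmd = 2 * V^2 / L1 * sin(eta), with eta the signed angle from the
    ground-velocity vector to the line of sight to a reference point on the
    path. Positive output commands a turn towards the path (left-positive)."""
    los = np.asarray(ref_point, dtype=float) - np.asarray(position, dtype=float)
    l1 = np.linalg.norm(los)          # distance to the reference point
    v = np.linalg.norm(ground_vel)    # ground speed (simple wind proxy)
    # Signed angle between velocity and line of sight (2-D cross / dot).
    eta = np.arctan2(ground_vel[0] * los[1] - ground_vel[1] * los[0],
                     ground_vel[0] * los[0] + ground_vel[1] * los[1])
    return 2.0 * v**2 / l1 * np.sin(eta)

# Illustrative scenario (not thesis data): aircraft 30 m south of a
# west-east path, flying east at 20 m/s, reference point 100 m away on
# the path (x east, y north).
a = lateral_accel_cmd(position=(0.0, -30.0),
                      ground_vel=(20.0, 0.0),
                      ref_point=(np.sqrt(100.0**2 - 30.0**2), 0.0))
print(f"commanded lateral acceleration: {a:.2f} m/s^2")  # 2.40
```

The V²/L1 scaling is what gives this family of laws its blend of proportional and anticipatory behaviour on curved paths, which is one reason it is a popular starting point for fixed-wing path following.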

Relevance:

10.00%

Publisher:

Abstract:

This report provides an account of the first large-scale scoping study of work integrated learning (WIL) in contemporary Australian higher education. The explicit aim of the project was to identify issues and map a broad and growing picture of WIL across Australia and to identify ways of improving the student learning experience in relation to WIL. The project was undertaken in response to high levels of interest in WIL, which is seen by universities both as a valid pedagogy and as a means to respond to demands by employers for work-ready graduates, and demands by students for employable knowledge and skills. Over a period of eight months of rapid data collection, 35 universities and almost 600 participants contributed to the project. Participants consistently reported the positive benefits of WIL and provided evidence of commitment and innovative practice in relation to enhancing student learning experiences. Participants provided evidence of strong partnerships between stakeholders and highlighted the importance of these relationships in facilitating effective learning outcomes for students. They also identified a range of issues and challenges that face the sector in growing WIL opportunities; these issues and challenges will shape the quality of WIL experiences. While the majority of comments focused on issues involved in ensuring quality placements, it was recognised that placements are just one way to ensure the integration of work with learning. Also, the WIL experience is highly contextualised and impacted by the expectations of students, employers, the professions, the university and government policy.