63 results for PART I
Abstract:
Achieving business and IT integration is a strategic goal for many organisations; it has almost become the ‘Holy Grail’ of organisational success. In this environment, Enterprise Resource Planning (ERP) packages have become the de facto option for addressing this issue. Integration has come to mean adopting ERP through configuration and without customization, but this all-or-nothing approach has proved difficult for many organisations. In Part 1 of a two-part update we provide evidence from the field suggesting that, whilst costly, customization can have value in aiding organisational integration efforts if managed appropriately. In Part 2, we discuss in more detail the benefits and pitfalls involved in enacting a non-standard-based integration strategy.
Abstract:
Explores how young people in Australia first come to inject drugs and how they learn about hepatitis C and sterile injecting practice. Background on hepatitis C; Reasons for injecting drugs; Selection criteria for young people's participation in the i2i Project.
Abstract:
Bone generation by autogenous cell transplantation in combination with a biodegradable scaffold is one of the most promising techniques being developed in craniofacial surgery. The objective of this combined in vitro and in vivo study was to evaluate the morphology and osteogenic differentiation of bone marrow-derived mesenchymal progenitor cells and calvarial osteoblasts in two-dimensional (2-D) and three-dimensional (3-D) culture environments (Part I of this study) and their potential in combination with a biodegradable scaffold to reconstruct critical-size calvarial defects in an autologous animal model [Part II of this study; see Schantz, J.T., et al. Tissue Eng. 2003;9(Suppl. 1):S-127-S-139; this issue]. New Zealand White rabbits were used to isolate osteoblasts from calvarial bone chips and bone marrow stromal cells from iliac crest bone marrow aspirates. Multilineage differentiation potential was evaluated in a 2-D culture setting. After amplification, the cells were seeded within a fibrin matrix into a 3-D polycaprolactone (PCL) scaffold system. The constructs were cultured for up to 3 weeks in vitro and assayed for cell attachment and proliferation using phase-contrast light, confocal laser, and scanning electron microscopy and the MTS cell metabolic assay. Osteogenic differentiation was analyzed by determining the expression of alkaline phosphatase (ALP) and osteocalcin. The bone marrow-derived progenitor cells demonstrated the potential to be induced to the osteogenic, adipogenic, and chondrogenic pathways. In a 3-D environment, cell-seeded PCL scaffolds evaluated by confocal laser microscopy revealed continuous cell proliferation and homogeneous cell distribution within the scaffolds. On osteogenic induction, mesenchymal progenitor cells produced significantly higher ALP activity (12 U/L) than did osteoblasts (2 U/L; p < 0.05); however, no significant differences were found in osteocalcin expression. In conclusion, this study showed that the combination of a mechanically stable synthetic framework (PCL scaffold) and a biomimetic hydrogel (fibrin glue) provides a potential matrix for bone tissue-engineering applications. Comparison of osteogenic differentiation between the two mesenchymal cell sources revealed a similar pattern.
Performance of elite seated discus throwers in F30s classes: Part II: Does feet positioning matter?
Abstract:
Background: Studies on the relationship between performance and design of the throwing frame have been limited. Part I provided only a description of whole-body positioning. Objectives: The specific objectives were (a) to benchmark feet positioning characteristics (i.e. position, spacing and orientation) and (b) to investigate the relationship between performance and these characteristics for male seated discus throwers in F30s classes. Study Design: Descriptive analysis. Methods: A total of 48 attempts performed by 12 stationary discus throwers in the F33 and F34 classes during the seated discus throwing event of the 2002 International Paralympic Committee Athletics World Championships were analysed in this study. Feet positioning was characterised by three-dimensional data on the front and back foot positions, as well as by feet spacing and orientation, corresponding respectively to the distance between and the angle made by the two feet. Results: Only 4 of 30 feet positioning characteristics presented a correlation coefficient greater than 0.5, including the feet spacing on the mediolateral and anteroposterior axes in the F34 class as well as the back foot position and feet spacing on the mediolateral axis in the F33 class. Conclusions: This study provided key information for a better understanding of the interaction between the throwing technique of elite seated throwers and their throwing frame.
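As a rough illustration of the screening reported above, a minimal sketch of how a correlation coefficient between one feet-positioning characteristic and throw performance could be computed; all values are hypothetical stand-ins, not the study's data:

```python
import numpy as np

# Hypothetical example: mediolateral feet spacing (m) and throw
# distance (m) across a handful of attempts.
spacing = np.array([0.32, 0.41, 0.38, 0.45, 0.36, 0.40, 0.43, 0.39])
distance = np.array([18.2, 21.5, 20.1, 22.3, 19.0, 20.8, 21.9, 20.4])

# Pearson correlation coefficient; the study flags characteristics
# whose coefficient exceeds 0.5.
r = np.corrcoef(spacing, distance)[0, 1]
print(f"correlation coefficient r = {r:.2f}")
```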
Abstract:
These two lecture notes describe in detail recent developments in evolutionary multi-objective optimisation (MO) techniques, together with their advantages and drawbacks compared with traditional deterministic optimisers. The role of Game Strategies (GS), such as Pareto, Nash or Stackelberg games, as companions or pre-conditioners of multi-objective optimisers is presented and discussed on simple mathematical functions in Part I, while their implementation on simple aeronautical model optimisation problems, using a friendly design framework on the computer, is covered in Part II. Real-life (robust) design applications dealing with UAV systems or civil aircraft, using the combined EA and Game Strategy material of Parts I and II, are solved and discussed in Part III, providing the designer with new compromise solutions useful for digital aircraft design and manufacturing. Many details related to lecture notes Parts I, II and III can be found in [68].
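As a minimal illustration of the Pareto concept underlying the game strategies above, a sketch of a non-dominated filter for a toy two-objective minimisation problem; the objective values are hypothetical and this is not the lecture notes' design framework:

```python
import numpy as np

def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimisation):
    a is no worse in every objective and strictly better in at least one."""
    return bool(np.all(a <= b) and np.any(a < b))

def pareto_front(points):
    """Return the non-dominated subset of a set of objective vectors."""
    pts = [np.asarray(p, dtype=float) for p in points]
    return [p for p in pts
            if not any(dominates(q, p) for q in pts if q is not p)]

# Toy trade-off, e.g. drag vs. structural weight, both to be minimised.
candidates = [(1.0, 5.0), (2.0, 3.0), (3.0, 4.0), (4.0, 1.0), (2.5, 2.5)]
print(pareto_front(candidates))
```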
Abstract:
Video installation, Metropolis: Part I-III, using single-channel HD video with surround sound. At the end of the first decade of the twenty-first century, contemporary culture appears increasingly seduced and absorbed by apocalyptic reveries. Scientists are racing to cryo-preserve genetic material from animals and plant matter in underground bunkers, while filmmakers use the spectacle of computer-generated imagery (CGI) to speculate on outcomes of dramatic climate change that we are not yet ready to confront in reality... Premier of Queensland's National New Media Art Award 2010: http://www.qagoma.qld.gov.au/exhibitions/past/2010/premier_of_queenslands_national_new_media_art_award_2010/chris_howlett
Abstract:
We present global and regional rates of brain atrophy measured on serially acquired T1-weighted brain MR images for a group of Alzheimer's disease (AD) patients and age-matched normal control (NC) subjects, using the analysis procedure described in Part I. Three rates of brain atrophy (the rate of atrophy in the cerebrum, the rate of lateral ventricular enlargement, and the rate of atrophy in the region of the temporal lobes) were evaluated for 14 AD patients and 14 age-matched NC subjects. All three rates showed significant differences between the two groups. However, the greatest separation of the two groups was obtained when the regional rates were combined. This application demonstrates that rates of brain atrophy measured on MR images, especially in specific regions of the brain, can provide sensitive measures for evaluating the progression of AD. These measures will be useful for evaluating the therapeutic effects of novel therapies for AD.
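A hedged illustration only: the paper's measurement procedure is defined in Part I and is not reproduced here; one simple way to express such a rate is annualised percent volume change between serial scans, sketched below with hypothetical volumes:

```python
# Minimal sketch (hypothetical numbers): annualised regional atrophy rate
# from two serial volume measurements, as percent loss per year relative
# to baseline. This is an illustrative definition, not the paper's method.
def annualised_atrophy_rate(vol_baseline_ml, vol_followup_ml, interval_years):
    return 100.0 * (vol_baseline_ml - vol_followup_ml) / (vol_baseline_ml * interval_years)

# e.g. a cerebrum volume falling from 1050 mL to 1020 mL over 1.5 years
print(f"{annualised_atrophy_rate(1050.0, 1020.0, 1.5):.2f} % per year")
```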
Abstract:
This paper describes a novel framework for facial expression recognition from still images by selecting, optimizing and fusing ‘salient’ Gabor feature layers to recognize six universal facial expressions using the K-nearest-neighbor classifier. Recognition comparisons with the all-layer approach on the JAFFE and Cohn-Kanade (CK) databases confirm that using ‘salient’ Gabor feature layers with optimized sizes can achieve better recognition performance and dramatically reduce computational time. Moreover, comparisons with state-of-the-art performance demonstrate the effectiveness of our approach.
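A rough sketch of the general pipeline named above, Gabor feature layers feeding a K-nearest-neighbor classifier; the paper's salient-layer selection, optimization and fusion steps are omitted, and the images, filter-bank parameters and labels below are hypothetical stand-ins, not the authors' setup:

```python
import numpy as np
from skimage.filters import gabor
from sklearn.neighbors import KNeighborsClassifier

def gabor_features(img, frequencies=(0.1, 0.2, 0.3),
                   thetas=(0, np.pi / 4, np.pi / 2, 3 * np.pi / 4)):
    """Pool magnitude responses of a small Gabor filter bank into one vector."""
    feats = []
    for f in frequencies:
        for t in thetas:
            real, imag = gabor(img, frequency=f, theta=t)
            mag = np.hypot(real, imag)
            feats.extend([mag.mean(), mag.std()])  # coarse pooling per layer
    return np.array(feats)

# Stand-ins for face crops and their expression labels (six classes).
rng = np.random.default_rng(0)
X_imgs = [rng.random((64, 64)) for _ in range(12)]
y = rng.integers(0, 6, size=12)

X = np.stack([gabor_features(im) for im in X_imgs])
clf = KNeighborsClassifier(n_neighbors=3).fit(X, y)
print(clf.predict(X[:2]))
```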
Abstract:
This reader in popular cultural studies meets the need for an up-to-date collection of readings on contemporary youth cultures and youth music. Table of Contents: Introduction: Reading Pop(ular) Cult(ural) Stud(ie)s: Steve Redhead. Part I: Theory I. 1. Pearls and Swine: Intellectuals and the Mass Media: Simon Frith and Jon Savage. 2. Over-the-Counter Culture: Retheorising Resistance in Popular Culture: Beverly Best. Part II: Commentaries. 3. Organised Disorder: The Changing Space of the Record Shop: Will Straw. 4. Spatial Politics: A Gendered Sense of Place: Cressida Miles. 5. Let's All Have a Disco? Football, Popular Music and Democratisation: Adam Brown. 6. Rave Culture: Living Dream or Living Death?: Simon Reynolds. 7. Fear and Loathing in Wisconsin: Sarah Champion. 8. The House Sound of Chicago: Hillegonda Rietveld. 9. Cocaine Girls: Marek Kohn. 10. In the Supermarket of Style: Ted Polhemus. 11. Love Factory: The Sites, Practices and Media Relationships of Northern Soul: Kate Milestone. 12. DJ Culture: Dave Haslam. Plates: Patrick Henry. Part III: Theory II. 13. The Post-Subculturalist: David Muggleton. 14. Reading Pop: The Press, the Scholar and the Consequences of Popular Cultural Studies: Steve Jones. 15. Re-placing Popular Culture: Lawrence Grossberg. Index.
Abstract:
Financial processes may possess long memory and their probability densities may display heavy tails. Many models have been developed to deal with this tail behaviour, which reflects the jumps in the sample paths. On the other hand, the presence of long memory, which contradicts the efficient market hypothesis, is still an issue for further debate. These difficulties present challenges for the problems of memory detection and of modelling the co-presence of long memory and heavy tails. This PhD project aims to respond to these challenges. The first part aims to detect memory in a large number of financial time series on stock prices and exchange rates using their scaling properties. Since financial time series often exhibit stochastic trends, a common form of nonstationarity, strong trends in the data can lead to false detection of memory. We take advantage of a technique known as multifractal detrended fluctuation analysis (MF-DFA) that can systematically eliminate trends of different orders. This method is based on the identification of the scaling of the q-th-order moments and is a generalisation of the standard detrended fluctuation analysis (DFA), which uses only the second moment; that is, q = 2. We also consider the rescaled range (R/S) analysis and the periodogram method to detect memory in financial time series and compare their results with those of MF-DFA. An interesting finding is that short memory is detected for stock prices of the American Stock Exchange (AMEX) and long memory is found present in the time series of two exchange rates, namely the French franc and the Deutsche mark. Electricity price series of the five states of Australia are also found to possess long memory. For these electricity price series, heavy tails are also pronounced in their probability densities. The second part of the thesis develops models to represent short-memory and long-memory financial processes as detected in Part I. These models take the form of continuous-time AR(∞)-type equations whose kernel is the Laplace transform of a finite Borel measure. By imposing appropriate conditions on this measure, short memory or long memory in the dynamics of the solution will result. A specific form of the models, which has a good MA(∞)-type representation, is presented for the short-memory case. Parameter estimation for this type of model is performed via least squares, and the models are applied to the stock prices in the AMEX, which were established in Part I to possess short memory. By selecting the kernel in the continuous-time AR(∞)-type equations to have the form of a Riemann-Liouville fractional derivative, we obtain a fractional stochastic differential equation driven by Brownian motion. This type of equation is used to represent financial processes with long memory, whose dynamics are described by the fractional derivative in the equation. These models are estimated via quasi-likelihood, namely via a continuous-time version of the Gauss-Whittle method. The models are applied to the exchange rates and the electricity prices of Part I with the aim of confirming their possible long-range dependence established by MF-DFA. The third part of the thesis provides an application of the results established in Parts I and II to characterise and classify financial markets. We pay attention to the New York Stock Exchange (NYSE), the American Stock Exchange (AMEX), the NASDAQ Stock Exchange (NASDAQ) and the Toronto Stock Exchange (TSX). The parameters from MF-DFA and those of the short-memory AR(∞)-type models are employed in this classification. We propose the Fisher discriminant algorithm to find a classifier in the two- and three-dimensional spaces of data sets and then provide cross-validation to verify discriminant accuracies. This classification is useful for understanding and predicting the behaviour of different processes within the same market. The fourth part of the thesis investigates the heavy-tailed behaviour of financial processes which may also possess long memory. We consider fractional stochastic differential equations driven by stable noise to model financial processes such as electricity prices. The long memory of electricity prices is represented by a fractional derivative, while the stable noise input models their non-Gaussianity via the tails of their probability density. A method using the empirical densities and MF-DFA is provided to estimate all the parameters of the model and simulate sample paths of the equation. The method is then applied to analyse daily spot prices for five states of Australia. Comparisons with the results obtained from the R/S analysis, the periodogram method and MF-DFA are provided. The results from the fractional SDEs agree with those from MF-DFA, which are based on multifractal scaling, while those from the periodograms, which are based on the second moment, seem to underestimate the long-memory dynamics of the process. This highlights the need for, and usefulness of, fractal methods in modelling non-Gaussian financial processes with long memory.
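A minimal sketch of the MF-DFA idea summarised above: the profile of the series is detrended segment-wise with a polynomial, and the q-th-order fluctuation function F_q(s) is formed; the slope of log F_q(s) against log s gives the generalised Hurst exponent h(q), with q = 2 recovering standard DFA. An illustrative simplification, not the thesis's implementation:

```python
import numpy as np

def mfdfa_fluctuation(x, scales, q=2.0, order=1):
    """q-th-order fluctuation function F_q(s) of MF-DFA (for q != 0)."""
    y = np.cumsum(x - np.mean(x))                  # profile of the series
    F = []
    for s in scales:
        n_seg = len(y) // s
        var = []
        for v in range(n_seg):                     # segment-wise detrending
            seg = y[v * s:(v + 1) * s]
            t = np.arange(s)
            trend = np.polyval(np.polyfit(t, seg, order), t)
            var.append(np.mean((seg - trend) ** 2))
        F.append(np.mean(np.array(var) ** (q / 2.0)) ** (1.0 / q))
    return np.array(F)

# White noise should give h(2) close to 0.5 (no long memory).
rng = np.random.default_rng(0)
x = rng.standard_normal(4096)
scales = np.array([16, 32, 64, 128, 256])
h2 = np.polyfit(np.log(scales), np.log(mfdfa_fluctuation(x, scales)), 1)[0]
print(f"estimated h(2) = {h2:.2f}")
```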
Abstract:
In this article we survey relevant international literature on the issue of parental liability and responsibility for the crimes of young offenders. In addition, as a starting point for needed cross-jurisdictional research, we focus on the different approaches that have been taken to making parents responsible for youth crime in Australia and Canada. This comparative analysis of Australian and Canadian legislative and policy approaches is situated within a broader discussion of arguments about parental responsibility, the ‘punitive turn’ in youth justice, and cross-jurisdictional criminal justice policy transfer and convergence. One unexpected finding of our literature survey is the relatively sparse attention given to the issue of parental responsibility for youth crime in the legal and criminological literature compared to the attention it receives in the media and popular-public culture. In Part I we examine the different views that have been articulated in the social science literature for and against parental responsibility laws, along with arguments that have been made about why such laws have been enacted in an increasing number of Western countries in recent years. In Part II, we situate our comparative study of Australian and Canadian legislative and policy approaches within a broader discussion of arguments about the ‘punitive turn’ in youth justice, responsibilisation, and cross-jurisdictional criminal justice policy transfer and convergence. In Part III, we identify and examine the scope of the different parental responsibility laws that have been enacted in Australia and Canada, noting significant differences in the manner and extent to which parental responsibility laws and policies have been invoked as part of the solution to dealing with youth crime. In our concluding discussion, in Part IV, we speculate on some of the reasons for these differences and set an agenda for needed future research on the topic.
Abstract:
The concept of "fair basing" is widely acknowledged as a difficult area of patent law. This article maps the development of fair basing law to demonstrate how some of the difficulties have arisen. Part I of the article traces the development of the branches of patent law that were swept under the nomenclature of "fair basing" by British legislation in 1949. It looks at the early courts' approach to patent construction, and examines the early origin of fair basing and what it was intended to achieve. Part II of the article considers the modern interpretation of fair basing, which provides a striking contrast to its historical context. Without any consistent judicial approach to construction, the doctrine has developed inappropriately, giving rise to both over-strict and over-generous approaches.
Abstract:
Part I of this book covers the commercial and contractual background to technology licensing agreements. Part II discusses the European Community's new regime on the application and enforcement of Article 81 to technology licensing agreements. EC Council Regulation 1/2003 replaced Council Regulation 17/1962 and repealed the system under which restrictive agreements and practices could be notified to the EC Commission. A new Commission regulation on technology transfer agreements, Regulation 772/2004, was also adopted. These two enactments required consequential amendments to the chapters in Part III, where the usual terms of technology licensing agreements are analysed and exemplified by reference to decided cases.
Abstract:
This thesis deals with the problem of instantaneous frequency (IF) estimation of sinusoidal signals. This topic plays a significant role in signal processing and communications. Depending on the type of the signal, two major approaches are considered. For IF estimation of single-tone or digitally-modulated sinusoidal signals (like frequency shift keying signals) the approach of digital phase-locked loops (DPLLs) is considered, and this is Part-I of this thesis. For FM signals the approach of time-frequency analysis is considered, and this is Part-II of the thesis. In Part-I we have utilized sinusoidal DPLLs with a non-uniform sampling scheme, as this type is widely used in communication systems. The digital tanlock loop (DTL) has introduced significant advantages over other existing DPLLs. In the last 10 years many efforts have been made to improve DTL performance. However, this loop and all of its modifications utilize a Hilbert transformer (HT) to produce a signal-independent 90-degree phase-shifted version of the input signal. The Hilbert transformer can be realized approximately using a finite impulse response (FIR) digital filter. This realization introduces further complexity in the loop, in addition to approximations and frequency limitations on the input signal. We have tried to avoid the practical difficulties associated with the conventional tanlock scheme while keeping its advantages. A time-delay is utilized in the tanlock scheme of the DTL to produce a signal-dependent phase shift. This gave rise to the time-delay digital tanlock loop (TDTL). Fixed point theorems are used to analyze the behavior of the new loop. As such, TDTL combines the two major approaches in DPLLs: the non-linear approach of the sinusoidal DPLL based on fixed point analysis, and the linear tanlock approach based on arctan phase detection. TDTL preserves the main advantages of the DTL despite its reduced structure. An application of TDTL in FSK demodulation is also considered. This idea of replacing the HT by a time-delay may be of interest in other signal processing systems. Hence we have analyzed and compared the behaviors of the HT and the time-delay in the presence of additive Gaussian noise. Based on the above analysis, the behavior of the first- and second-order TDTLs has been analyzed in additive Gaussian noise. Since DPLLs need time for locking, they are normally not efficient in tracking the continuously changing frequencies of non-stationary signals, i.e. signals with time-varying spectra. Non-stationary signals are of importance in synthetic and real-life applications. An example is the frequency-modulated (FM) signals widely used in communication systems. Part-II of this thesis is dedicated to the IF estimation of non-stationary signals. For such signals the classical spectral techniques break down, due to the time-varying nature of their spectra, and more advanced techniques should be utilized. For the purpose of instantaneous frequency estimation of non-stationary signals there are two major approaches: parametric and non-parametric. We chose the non-parametric approach, which is based on time-frequency analysis. This approach is computationally less expensive and more effective in dealing with multicomponent signals, which are the main aim of this part of the thesis. A time-frequency distribution (TFD) of a signal is a two-dimensional transformation of the signal to the time-frequency domain. Multicomponent signals can be identified by multiple energy peaks in the time-frequency domain. Many real-life and synthetic signals are of a multicomponent nature and there is little in the literature concerning IF estimation of such signals. This is why we have concentrated on multicomponent signals in Part-II. An adaptive algorithm for IF estimation using the quadratic time-frequency distributions has been analyzed. A class of time-frequency distributions that are more suitable for this purpose has been proposed. The kernels of this class are time-only or one-dimensional, rather than the time-lag (two-dimensional) kernels. Hence this class has been named the T-class. If the parameters of these TFDs are properly chosen, they are more efficient than the existing fixed-kernel TFDs in terms of resolution (energy concentration around the IF) and artifact reduction. The T-distributions have been used in the IF adaptive algorithm and proved to be efficient in tracking rapidly changing frequencies. They also enable direct amplitude estimation for the components of a multicomponent signal.
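A minimal illustration of the non-parametric, time-frequency route to IF estimation described above: for a mono-component FM signal the IF can be read off as the energy peak of a time-frequency representation at each time instant. The sketch uses a plain spectrogram as a stand-in for the thesis's quadratic T-class distributions; the signal and parameters are hypothetical:

```python
import numpy as np
from scipy.signal import spectrogram

fs = 1000.0                                   # sampling rate (Hz)
t = np.arange(0, 2.0, 1.0 / fs)
f0, rate = 50.0, 40.0                         # linear FM: IF(t) = f0 + rate * t
x = np.cos(2 * np.pi * (f0 * t + 0.5 * rate * t ** 2))

f, tt, S = spectrogram(x, fs=fs, nperseg=256, noverlap=192)
if_est = f[np.argmax(S, axis=0)]              # ridge: peak frequency per time slice
if_true = f0 + rate * tt
print(f"mean |IF error| = {np.mean(np.abs(if_est - if_true)):.2f} Hz")
```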