147 results for Catalan language -- To 1500 -- Word order -- Congresses


Abstract:

Novice programmers have difficulty developing an algorithmic solution while simultaneously obeying the syntactic constraints of the target programming language. To see how students fare in algorithmic problem solving when not burdened by syntax, we conducted an experiment in which a large class of beginning programmers was required to write a solution to a computational problem in structured English, as if instructing a child, without reference to program code at all. The students produced an unexpectedly wide range of solutions, both correct and merely attempted, some of which had not occurred to their teachers. We also found that many common programming errors were evident in the natural language algorithms, including failure to ensure loop termination, hardwiring of solutions, failure to properly initialise the computation, and use of unnecessary temporary variables. This suggests that these mistakes are caused by inexperience at thinking algorithmically, rather than by difficulty in expressing solutions as program code.
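
Although the students wrote structured English rather than code, the reported error patterns translate directly into programs. The following hypothetical Python fragments, written for this summary rather than taken from the study, illustrate three of them:

```python
# Hypothetical renderings of the error patterns reported above; the
# students' actual solutions were written in structured English, not code.

def largest_buggy(numbers):
    largest = 0      # failure to initialise properly: wrong for all-negative input
    i = 0
    while i < len(numbers):
        if numbers[i] > largest:
            largest = numbers[i]
        # failure to ensure loop termination: i is never incremented
    return largest

def largest_hardwired(numbers):
    return 9         # hardwiring: the answer for one sample input, not a general solution

def largest_fixed(numbers):
    largest = numbers[0]       # initialise from the data itself
    for n in numbers[1:]:      # the for-loop guarantees termination
        if n > largest:
            largest = n
    return largest
```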

Abstract:

The paper uses qualitative textual analysis to compare journalistic and academic accounts of child sexual abuse. There are seven main differences. Academic accounts suggest higher levels of neglect, emotional abuse, and physical abuse than sexual abuse in Australia; by contrast, journalistic accounts highlight sexual abuse. Academic accounts suggest that child sexual abuse in Australia is decreasing; journalistic accounts suggest that it is increasing. Academic accounts suggest that the majority of cases of child sexual abuse are perpetrated by family members; journalistic accounts focus on abuse by institutional figures (teachers, priests) or by strangers. Academic accounts have shown that innocent sexual play is a normal part of childhood development; journalistic accounts suggest that any sexual play is either a sign of abuse, or in itself constitutes sexual abuse. Academic accounts suggest that one of the best ways to prevent sexual abuse is for children to receive sex education; journalistic accounts suggest that children finding out about sex leads to sexual abuse. Academic accounts can gather data from the victims; journalistic accounts are excluded from doing so. Academic researchers talk to abusers in order to understand how child sexual abuse can be prevented; journalistic accounts exclude the voices of child sexual abusers.

Abstract:

SITDRM is a privacy protection system that protects private data through the enforcement of MPEG REL licenses provided by consumers. Direct issuing of licenses by consumers has several usability problems, which are discussed in this paper. Further, we describe how SITDRM incorporates the P3P language to provide a consumer-centered privacy protection system.

Abstract:

A body of critical legal scholarship argues that, by the time they have completed their studies, students who entered legal education holding social ideals and intending to use their legal education to achieve social change have become cynical about the ability of the law to do so and no longer possess such ideals. Critical scholars explain this as the result of a process of ideological indoctrination, exercised by law schools acting as adjuncts of the legal profession upon the passive body of the law student, and aimed at ensuring that graduates uphold the narrow and conservative interests of the legal profession and capitalist society. By using Foucault's work on knowledge, power, and the subject to interrogate the assumptions upon which this narrative is based, this thesis suggests a way of thinking differently from the approach taken by many critical legal scholars. It then uses an analytics of government (based on Foucault's notion of 'governmentality') to consider the construction of the legal identity differently. It examines the ways in which the governance of the legal identity is rationalised, programmed, and implemented in three Queensland law schools. It also looks at the way that five prescriptive texts on 'surviving' law school suggest students establish and practise a relation to themselves in order to construct their own legal identities. Overall, this analysis shows that governance is not simply conducted in the profession's interests, but occurs through a complex arrangement of different practices, which can lead to the construction of skilled legal professional identities as well as ethical lawyer-citizens who hold an interest in justice. The implications of such an analytics provide the basis for original ways of understanding legal education and legal education scholarship.

Abstract:

Since the 1980s, industries and researchers have sought to better understand the quality of services due to the rise in their importance (Brogowicz, Delene and Lyth 1990). More recent developments with online services, coupled with growing recognition of service quality (SQ) as a key contributor to national economies and as an increasingly important competitive differentiator, amplify the need to revisit our understanding of SQ and its measurement. Although 'SQ' can be broadly defined as "a global overarching judgment or attitude relating to the overall excellence or superiority of a service" (Parasuraman, Berry and Zeithaml 1988), the term has many interpretations. There has been considerable progress on how to measure SQ perceptions, but little consensus on what should be measured. There is agreement that SQ is multi-dimensional, but little agreement as to the nature or content of these dimensions (Brady and Cronin 2001). For example, within the banking sector there exist multiple SQ models, each consisting of varying dimensions. The existence of multiple conceptions and the lack of a unifying theory bring the credibility of existing conceptions into question, and raise the question of whether it is possible, at some higher level, to define SQ broadly such that it spans all service types and industries. This research explores the viability of a universal conception of SQ, primarily through a careful revisitation of the services and SQ literature. The study analyses the strengths and weaknesses of the highly regarded and widely used global SQ model, SERVQUAL, which reflects a single-level approach to SQ measurement. The SERVQUAL model states that customers evaluate the SQ of each service encounter on five dimensions: reliability, assurance, tangibles, empathy and responsiveness. SERVQUAL, however, fails to address what needs to be reliable, assured, tangible, empathetic and responsive. This research also addresses a more recent global SQ model from Brady and Cronin (2001), the B&C (2001) model, which has the potential to succeed SERVQUAL in that it encompasses other global SQ models and addresses the 'what' questions that SERVQUAL did not. The B&C (2001) model conceives SQ as multidimensional and multi-level; this hierarchical approach to SQ measurement better reflects human perceptions. In line with the initial intention of SERVQUAL, which was developed to be generalizable across industries and service types, this research aims to develop a conceptual understanding of SQ, via literature and reflection, that encompasses the content and nature of factors related to SQ, and addresses the benefits and weaknesses of various SQ measurement approaches (i.e. disconfirmation versus perceptions-only). Such an understanding of SQ seeks to transcend industries and service types, with the intention of extending our knowledge of SQ and assisting practitioners in understanding and evaluating it. The candidate's research has been conducted within, and seeks to contribute to, the 'IS-Impact' research track of the IT Professional Services (ITPS) Research Program at QUT. The vision of the track is "to develop the most widely employed model for benchmarking Information Systems in organizations for the joint benefit of research and practice." The 'IS-Impact' research track has developed an Information Systems (IS) success measurement model, the IS-Impact Model (Gable, Sedera and Chan 2008), which seeks to fulfill the track's vision.
Results of this study will help future researchers in the 'IS-Impact' research track address questions such as:
• Is SQ an antecedent or consequence of the IS-Impact model, or both?
• Has SQ already been addressed by existing measures of the IS-Impact model?
• Is SQ a separate, new dimension of the IS-Impact model?
• Is SQ an alternative conception of the IS?
Results from the candidate's research suggest that SQ dimensions can be classified at a higher level encompassed by the B&C (2001) model's three primary dimensions (interaction, physical environment and outcome). The candidate also notes that it might be viable to re-word the 'physical environment quality' primary dimension as 'environment quality' so as to better encompass both physical and virtual scenarios (e.g. web sites). The candidate does not rule out the global feasibility of the B&C (2001) model's nine sub-dimensions, but acknowledges that more work has to be done to better define them. The candidate observes that the 'expertise', 'design' and 'valence' sub-dimensions are supportive representations of the 'interaction', 'physical environment' and 'outcome' primary dimensions respectively; that is, customers evaluate each primary dimension (each higher level of SQ classification) based on the corresponding sub-dimension. The ability to classify SQ dimensions at a higher level, coupled with support for the measures that make up this higher level, leads the candidate to propose the B&C (2001) model as a unifying theory that acts as a starting point for measuring SQ and the SQ of IS. The candidate also notes, in parallel with the continuing validation and generalization of the IS-Impact model, that there is value in alternatively conceptualizing the IS as a 'service' and ultimately triangulating measures of IS SQ with the IS-Impact model. These further efforts are beyond the scope of the candidate's study. Results from the candidate's research also suggest that both the disconfirmation and perceptions-only approaches have their merits, and that the choice of approach depends on the objectives of the study. Should the objective be an overall evaluation of SQ, the perceptions-only approach is more appropriate, as it is more straightforward and reduces administrative overheads. Should the objective be to identify SQ gaps (shortfalls), the (measured) disconfirmation approach is more appropriate, as it can identify the areas that need improvement.
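
A minimal sketch may help to contrast the two measurement approaches weighed above: disconfirmation (SERVQUAL-style gap scores, perception minus expectation) versus perceptions-only (SERVPERF-style). The dimension names follow SERVQUAL; the scores are invented for illustration.

```python
# Illustrative contrast of the two SQ measurement approaches; the dimension
# names are SERVQUAL's, but the scores below are invented, not study data.

DIMENSIONS = ["reliability", "assurance", "tangibles", "empathy", "responsiveness"]

def gap_scores(perceptions, expectations):
    """Disconfirmation: negative gaps mark shortfalls needing improvement."""
    return {d: perceptions[d] - expectations[d] for d in DIMENSIONS}

def perceptions_only(perceptions):
    """Perceptions-only: a single overall score, simpler to administer."""
    return sum(perceptions.values()) / len(perceptions)

p = {"reliability": 5.5, "assurance": 6.0, "tangibles": 4.5,
     "empathy": 5.0, "responsiveness": 4.0}
e = {"reliability": 6.5, "assurance": 6.0, "tangibles": 5.0,
     "empathy": 5.5, "responsiveness": 6.0}

print(gap_scores(p, e))     # pinpoints where service falls short of expectations
print(perceptions_only(p))  # overall evaluation; no expectation data collected
```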

Abstract:

The general aim of designated driver programs is to reduce the level of drink driving by encouraging potential drivers to travel with a driver who has abstained from (or at least limited) consuming alcohol. Designated driver programs are quite widespread around the world; however, only a limited number have been rigorously evaluated. This paper reports the qualitative results from an evaluation of a designated driver program known as 'Skipper', in a provincial city in Queensland. Focus groups were conducted with 108 individuals from the intervention area. These focus groups aimed to assess the barriers and facilitators to the program's effectiveness by obtaining information about the patrons' views on various aspects of the program, as well as on designated drivers and travelling after drinking more generally. A brief questionnaire was also given to participants in order to relate responses to the participants' characteristics. Results suggest general support for the designated driver concept and for the 'Skipper' program specifically. Facilitating factors reported by participants included media coverage highlighting the risks associated with drink driving and the social acceptability of choosing not to drink. However, there was also some suggestion that the impact of the program was mainly to encourage those who already engage in designated driver behaviour to continue doing so, rather than to encourage uptake of the behaviour among potential new users. Suggested barriers to this kind of behaviour change include social pressure to drink, alcohol dependency, and a failure to plan ahead.

Abstract:

We extended an earlier study (Vision Research, 45, 1967–1974, 2005) in which we investigated the limits at which induced blur of letter targets becomes noticeable, troublesome and objectionable. Here we used a deformable adaptive optics mirror to vary spherical defocus under three conditions: a white background with correction of astigmatism; a white background with reduction of all aberrations other than defocus; and a monochromatic background with reduction of all aberrations other than defocus. We used seven cyclopleged subjects, lines of three high-contrast letters as targets, 3–6 mm artificial pupils, and 0.1–0.6 logMAR letter sizes. Subjects used a method of adjustment to control the defocus component of the mirror to set the 'just noticeable', 'just troublesome' and 'just objectionable' defocus levels. For the white, no-adaptive-optics condition combined with the 0.1 logMAR letter size, mean 'noticeable' blur limits were ±0.30, ±0.24 and ±0.23 D at 3, 4 and 6 mm pupils, respectively. The white-adaptive-optics and monochromatic-adaptive-optics conditions reduced blur limits by 8% and 20%, respectively. Increasing pupil size from 3 to 6 mm decreased blur limits by 29%, and increasing letter size increased blur limits by 79%. Ratios of troublesome to noticeable, and of objectionable to noticeable, blur limits were 1.9 and 2.7, respectively. The study shows that the deformable mirror can be used to vary defocus in vision experiments. Overall, the results for noticeable, troublesome and objectionable blur agreed well with those of the previous study. Attempting to reduce higher-order or chromatic aberrations reduced blur limits only to a small extent.

Abstract:

This talk proceeds from the premise that IR should engage in a more substantial dialogue with cognitive science. After all, how users decide relevance, or how they choose terms to modify a query, are processes rooted in human cognition. Recently, there has been a growing literature applying quantum theory (QT) to model cognitive phenomena. This talk surveys recent research, in particular the modelling of interference effects in human decision making. One aspect of QT will be illustrated: how quantum entanglement can be used to model word associations in human memory. The implications of this will be briefly discussed in terms of a new approach for modelling concept combinations. Tentative links to human abductive reasoning will also be drawn. The basic theme behind this talk is that QT can potentially provide a new genre of information processing models (including search) more aligned with human cognition.
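
As one concrete illustration of the models surveyed (a standard formulation from the quantum-cognition literature, not necessarily the one presented in this talk), interference effects in decision making are often modelled by augmenting the classical law of total probability with an interference term:

$$p(A) = p(B)\,p(A \mid B) + p(\bar{B})\,p(A \mid \bar{B}) + 2\sqrt{p(B)\,p(A \mid B)\,p(\bar{B})\,p(A \mid \bar{B})}\,\cos\theta$$

When the phase term $\cos\theta = 0$ the classical law is recovered; a non-zero phase reproduces the systematic deviations from classical probability observed in human judgment experiments.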

Abstract:

Managed execution frameworks, such as the .NET Common Language Runtime or the Java Virtual Machine, provide a rich environment for the creation of application programs. These execution environments are ideally suited for languages that depend on type-safety and the declarative control of feature access. Furthermore, such frameworks typically provide a rich collection of library primitives specialized for almost every domain of application programming. Thus, when a new language is implemented on one of these frameworks it becomes necessary to provide some kind of mapping from the new language to the libraries of the framework. The design of such mappings is challenging since the type-system of the new language may not span the domain exposed in the library application programming interfaces (APIs). The nature of these design considerations was clarified in the implementation of the Gardens Point Component Pascal (gpcp) compiler. In this paper we describe the issues, and the solutions that we settled on in this case. The problems that were solved have a wider applicability than just our example, since they arise whenever any similar language is hosted in such an environment.
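
One concrete instance of the type-system mismatch described above is method overloading: Component Pascal distinguishes procedures by name alone, while framework APIs overload a single name across many parameter lists. The toy sketch below shows one plausible mapping strategy, synthesising a distinct guest-level name per overload; the table and naming scheme are hypothetical illustrations, not the solution adopted in gpcp.

```python
# A toy sketch of one plausible mapping for a guest language without method
# overloading: give each framework overload a distinct, predictable guest
# name. The table and scheme below are hypothetical, not gpcp's actual one.

FRAMEWORK_OVERLOADS = [
    ("WriteLine", ()),              # e.g. System.Console.WriteLine()
    ("WriteLine", ("String",)),     # e.g. System.Console.WriteLine(String)
    ("WriteLine", ("Int32",)),      # e.g. System.Console.WriteLine(Int32)
]

def guest_name(method: str, param_types: tuple) -> str:
    """Mangle parameter types into the name so each overload is unique."""
    if not param_types:
        return method
    return method + "_" + "_".join(param_types)

for method, params in FRAMEWORK_OVERLOADS:
    print(f"{method}({', '.join(params)}) -> {guest_name(method, params)}")
```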

Abstract:

Construction clients often use financial incentives to encourage stakeholder motivation and commitment to voluntary higher-order project goals. Despite the increased use of financial incentives, there is little literature addressing means of optimizing outcomes. Using a case study methodology, the examination of a successful Australian construction project demonstrates the features of a positively geared procurement approach that promotes the effectiveness of financial incentives. The results show that if the incentive system is perceived to be fair and is applied to reward exceptional performance rather than to manipulate, then contractors are more likely to be positively motivated.

Abstract:

The structure-building phenomena within clay aggregates are governed by forces acting between clay particles. Measuring such forces is important for understanding how to manipulate aggregate structure for applications such as the dewatering of mineral processing tailings. A parallel particle orientation is required both for conducting XRD investigations on oriented samples and for measuring the forces acting between the basal planes of clay mineral platelets using atomic force microscopy (AFM). To investigate how smectite clay platelets are oriented on a silicon wafer substrate when dried from suspension, a range of methods (SEM, XRD and AFM) was employed. From these investigations we conclude that high clay concentrations and larger particle diameters (up to 5 μm) in suspension result in random orientation of the platelets on the substrate. The best laminar orientation in the dry clay film, represented by an XRD (001)/(020) intensity ratio of 47, was obtained by drying thin layers from 0.02 wt.% clay suspensions at natural pH. The AFM investigations show that the smectite studied in water-based electrolytes exhibits very long-range repulsive forces that are weaker than the electrostatic forces of double-layer repulsion. It is suggested that these forces may be structural in nature. Smectite surface layers rehydrate in a water environment, forming a surface gel with a spongy, cellular texture that cushions the approaching AFM probe. This structural effect can be measured at distances greater than 1000 nm from the substrate surface, and when the probe penetrates this gel layer, structural linkages form between the substrate and the clay-covered probe. These linkages subsequently prevent smooth detachment of the AFM probe on retraction; tearing the newly formed structure apart produces the larger, adhesion-like forces measured during retraction. It is also suggested that this effect may be enhanced by interactions between the nano-sized clay particles.

Abstract:

In this thesis we are interested in financial risk, and the instrument we use is Value-at-Risk (VaR). VaR is the maximum loss over a given period of time at a given confidence level. Many definitions of VaR exist, and some will be introduced throughout this thesis. There are two main ways to measure risk and VaR: through volatility and through percentiles. Large volatility in financial returns implies a greater probability of large losses, but also a larger probability of large profits. Percentiles describe tail behaviour. The estimation of VaR is a complex task. It is important to know the main characteristics of financial data in order to choose the best model. The existing literature is very wide, perhaps controversial, but helpful in drawing a picture of the problem. It is commonly recognised that financial data are characterised by heavy tails, time-varying volatility, asymmetric response to bad and good news, and skewness. Ignoring any of these features can lead to underestimating VaR, with a possible ultimate consequence being the default of the protagonist (firm, bank or investor). In recent years, skewness has attracted special attention. An open problem is the detection and modelling of time-varying skewness: is skewness constant, or is there some significant variability which in turn can affect the estimation of VaR? This thesis aims to answer this question and to open the way to a new approach for simultaneously modelling time-varying volatility (conditional variance) and skewness. The new tools are modifications of the Generalised Lambda Distributions (GLDs). These are four-parameter distributions which allow the first four moments to be modelled nearly independently; in particular we are interested in what we will call para-moments, i.e., mean, variance, skewness and kurtosis. The GLDs will be used in two different ways. Firstly, semi-parametrically, we consider a moving window to estimate the parameters and calculate the percentiles of the GLDs. Secondly, parametrically, we attempt to extend the GLDs to include time-varying dependence in the parameters. We used local linear regression to estimate the conditional mean and conditional variance semi-parametrically. The method is not efficient enough to capture all the dependence structure in the three indices (ASX 200, S&P 500 and FT 30); however, it provides an idea of the data-generating process underlying the data and helps in choosing a good technique to model them. We find that the GLDs suggest that moments up to the fourth order do not always exist; their existence appears to vary over time. This is a very important finding, considering that past papers (see for example Bali et al., 2008; Hashmi and Tay, 2007; Lanne and Pentti, 2007) modelled time-varying skewness while implicitly assuming the existence of the third moment. The GLDs further suggest that the mean, variance, skewness and, in general, the conditional distribution vary over time, as already suggested by the existing literature. The GLDs give good results in estimating VaR on three real indices, ASX 200, S&P 500 and FT 30, with results very similar to those provided by historical simulation.
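
Since the thesis benchmarks its GLD-based estimates against historical simulation, a minimal sketch of that percentile-based baseline may be helpful. The heavy-tailed Student-t returns below are simulated purely for illustration; they are not data from the study.

```python
import numpy as np

def historical_var(returns, alpha=0.01):
    """One-period Value-at-Risk by historical simulation.

    VaR at confidence level 1-alpha is the loss exceeded with probability
    alpha: the negated alpha-quantile of the observed return sample.
    """
    return -np.quantile(returns, alpha)

# Illustrative use on simulated heavy-tailed (Student-t) returns, since
# financial returns are characterised by heavy tails as noted above.
rng = np.random.default_rng(42)
returns = 0.01 * rng.standard_t(df=4, size=2500)
print(f"99% one-day VaR: {historical_var(returns, alpha=0.01):.4f}")
```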

Abstract:

The critical problem of student disengagement and underachievement in the middle years of schooling (Years 4–9) has focussed attention on the quality of educational programs in schools, in Australia and elsewhere. The loss of enthusiasm for science in the middle years is particularly problematic given the growing demand for science professionals. Reshaping middle years programs has included an emphasis on integrating Information and Communication Technologies (ICTs) and improving assessment practices to engage students in higher cognitive processes and enhance academic rigour. Understanding the nature of academic rigour and how to embed it in students' science assessment tasks that incorporate the use of ICTs could enable teachers to optimise the quality of the learning environment. However, academic rigour is not clearly described or defined in the literature and there is little empirical evidence upon which researchers and teachers could draw to enhance understandings. This study used a collective case study design to explore teachers' understandings of academic rigour within science assessment tasks. The research design is based on a conceptual framework that is underpinned by socio-cultural theory. Three methods were used to collect data from six middle years teachers and their students: a survey, a focus group discussion with teachers and a group of students, and individual semi-structured interviews with teachers. Findings of the case study revealed six criteria of academic rigour, namely higher order thinking, alignment, building on prior knowledge, scaffolding, knowledge construction and creativity. Results showed that the middle years teachers held rich understandings of academic rigour that led to effective utilisation of ICTs in science assessment tasks. Findings also indicated that teachers could further enhance their understandings of academic rigour in some aspects of each of the criteria. In particular, this study found that academic rigour could have been further optimised by: promoting more thoughtful discourse and interaction to foster higher order thinking; increasing alignment between curriculum, pedagogy, and assessment, and students' prior knowledge; placing greater emphasis on identifying, activating and building on prior knowledge; better differentiating the level of scaffolding provided and applying it more judiciously; fostering creativity throughout tasks; enhancing teachers' content knowledge and pedagogical content knowledge; and providing more in-depth coverage of fewer topics to support knowledge construction. Key contributions of this study are a definition and a model which clarify the nature of academic rigour.

Abstract:

Through an exploration of representations of metamorphosis and the creation of a body of written work, this thesis uses a critical examination of theoretical approaches to metamorphosis, in combination with textual analysis of representations of metamorphosis and creative practice as research, to arrive at the beginnings of an ethic of writing. The creative work, The Coming, consists of a collection of short fiction, The Coming, and two collections of poetry, Orison and Milagros. The exegesis, Transhuman Change: towards an ethic of writing, explores theories about metamorphosis as a figure for writing, as a trope, and as a motif for exploring identity, to contextualise the analysis of representations of metamorphosis from which the ethic is developed. With reference to the psychosexual development theory of Jacques Lacan and Elaine Scarry's philosophy of the body, pain, language and creativity, the exegesis examines existing approaches to metamorphosis and uses supplementary textual analysis of influential representations of metamorphosis, from Ovid to Pygmalion, X-Men and Extreme Makeover, to explore assumptions about the body, language, the self, and gender in western culture. The limitations of the performance of representations of metamorphosis as a figure for the self's survival of death are considered in the light of voice as metonym for self, to propose an ethic which valorises life. The experience of sex and the construction of gender in representations of metamorphosis are considered in the light of Lacan's theory of desire and Scarry's theory of the body and language, to propose an ethic of representing gender ironically. The motif of the faithless lover and the Pygmalion myth are considered in the light of the (m)other's role in language, to propose an ethic in which indeterminacy constitutes the condition for being aware of oneself among selves. Each of the three proposals is discussed in relation to the short fiction, memoir and poems produced in the course of this research, to test their limits and possibilities as the foundation of an emerging ethic of writing.

Abstract:

The performance of an adaptive filter may be studied through the behaviour of the optimal and adaptive coefficients in a given environment. This thesis investigates the performance of finite impulse response adaptive lattice filters for two classes of input signals: (a) frequency modulated signals with polynomial phases of order p in complex Gaussian white noise (as nonstationary signals), and (b) impulsive autoregressive processes with alpha-stable distributions (as non-Gaussian signals). Initially, an overview is given of linear prediction and adaptive filtering. The convergence and tracking properties of the stochastic gradient algorithms are discussed for stationary and nonstationary input signals. It is explained that the stochastic gradient lattice algorithm has many advantages over the least-mean square algorithm, among them a modular structure, easily guaranteed stability, less sensitivity to the eigenvalue spread of the input autocorrelation matrix, and easy quantization of the filter coefficients (normally called reflection coefficients). We then characterize the performance of the stochastic gradient lattice algorithm for frequency modulated signals through the optimal and adaptive lattice reflection coefficients. This is a difficult task due to the nonlinear dependence of the adaptive reflection coefficients on the preceding stages and the input signal. To ease the derivations, we assume that the reflection coefficients of each stage are independent of the inputs to that stage. The optimal lattice filter is then derived for frequency modulated signals by computing the optimal values of the residual errors, reflection coefficients, and recovery errors. Next, we show the tracking behaviour of the adaptive reflection coefficients for frequency modulated signals by computing the average tracking model of these coefficients under the stochastic gradient lattice algorithm. The second-order convergence of the adaptive coefficients is investigated by modeling the theoretical asymptotic variance of the gradient noise at each stage. The accuracy of the analytical results is verified by computer simulations. Using these analytical results, we demonstrate a new property of adaptive lattice filters, the polynomial order reducing property, which may be used to reduce the order of the polynomial phase of input frequency modulated signals. Two examples show how this property may be used in processing frequency modulated signals. In the first example, a detection procedure is carried out on a frequency modulated signal with a second-order polynomial phase in complex Gaussian white noise. We show that using this technique a better probability of detection is obtained for the reduced-order phase signals than with the traditional energy detector. It is also empirically shown that the distribution of the gradient noise in the first adaptive reflection coefficients approximates the Gaussian law. In the second example, the instantaneous frequency of the same observed signal is estimated. We show that this technique achieves a lower mean square error for the estimated frequencies at high signal-to-noise ratios than the adaptive line enhancer. The performance of adaptive lattice filters is then investigated for the second class of input signals, i.e., impulsive autoregressive processes with alpha-stable distributions.
The concept of alpha-stable distributions is first introduced. We discuss how the stochastic gradient algorithm, which gives desirable results for finite-variance input signals (such as frequency modulated signals in noise), does not converge quickly for infinite-variance stable processes, because it uses the minimum mean-square error criterion. To deal with such problems, the concepts of the minimum dispersion criterion and fractional lower order moments, together with recently developed algorithms for stable processes, are introduced. We then study the possibility of using the lattice structure for impulsive stable processes. Accordingly, two new algorithms, the least-mean P-norm lattice algorithm and its normalized version, are proposed for lattice filters based on fractional lower order moments. Simulation results show that the proposed algorithms achieve faster convergence in parameter estimation of autoregressive stable processes with low to moderate degrees of impulsiveness, in comparison to many other algorithms. We also discuss the effect of the impulsiveness of stable processes in generating misalignment between the estimated parameters and the true values. Due to the infinite variance of stable processes, the performance of the proposed algorithms is investigated using extensive computer simulations only.
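
For readers unfamiliar with the lattice recursions referred to above, the sketch below shows a generic stochastic gradient (LMS-style) lattice predictor in Python. It is a minimal textbook form written for this summary, not the thesis's least-mean P-norm algorithm, whose update would replace the squared-error gradient with a fractional lower-order one; the AR(1) test signal is likewise illustrative only.

```python
import numpy as np

def gal_predict(x, order=2, mu=0.005):
    """Stochastic gradient (LMS-style) lattice predictor -- a minimal sketch.

    Each stage m holds one reflection coefficient k[m], adapted to minimise
    the sum of squared forward and backward prediction errors at that stage.
    """
    k = np.zeros(order)               # adaptive reflection coefficients
    b_prev = np.zeros(order + 1)      # backward errors from the previous sample
    k_hist = np.zeros((len(x), order))
    err = np.zeros(len(x))            # final-stage forward prediction error
    for n, xn in enumerate(x):
        f = np.zeros(order + 1)
        b = np.zeros(order + 1)
        f[0] = b[0] = xn              # stage 0: both errors equal the input
        for m in range(1, order + 1):
            f[m] = f[m - 1] - k[m - 1] * b_prev[m - 1]
            b[m] = b_prev[m - 1] - k[m - 1] * f[m - 1]
            # gradient step on (f_m^2 + b_m^2)/2 with respect to k[m-1]
            k[m - 1] += mu * (f[m] * b_prev[m - 1] + b[m] * f[m - 1])
        b_prev = b
        k_hist[n] = k
        err[n] = f[order]
    return err, k_hist

# For an AR(1) input, the single reflection coefficient should approach the
# lag-1 autocorrelation of the process (0.8 here).
rng = np.random.default_rng(0)
x = np.zeros(5000)
for n in range(1, len(x)):
    x[n] = 0.8 * x[n - 1] + rng.standard_normal()
err, k_hist = gal_predict(x, order=1)
print(round(k_hist[-1, 0], 2))
```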