509 results for commercial language technology
Abstract:
Automatic spoken Language Identification (LID) is the process of identifying the language spoken within an utterance. The challenge this task presents is that no prior information is available indicating the content of the utterance or the identity of the speaker. The trend of globalization and the pervasive popularity of the Internet will amplify the need for the capabilities spoken language identification systems provide. A prominent application arises in call centers dealing with speakers of different languages. Another important application is to index or search huge speech data archives and corpora that contain multiple languages. The aim of this research is to develop techniques for producing a faster and more accurate automatic spoken LID system than those presented at the previous National Institute of Standards and Technology (NIST) Language Recognition Evaluation. Acoustic and phonetic speech information are targeted as the most suitable features for representing the characteristics of a language. To model the acoustic speech features, a Gaussian Mixture Model (GMM) based approach is employed. Phonetic speech information is extracted using existing speech recognition technology. Various techniques to improve LID accuracy are also studied. One approach examined is the employment of Vocal Tract Length Normalization to reduce the speech variation caused by different speakers. A linear data fusion technique is adopted to combine the various aspects of information extracted from speech. As a result of this research, an LID system was implemented and presented for evaluation in the 2003 Language Recognition Evaluation conducted by NIST.
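The per-language GMM scoring idea can be illustrated with a minimal sketch, assuming scikit-learn and random placeholder arrays standing in for real acoustic features (e.g. MFCCs); this is a generic illustration of the technique, not the system described in the abstract.

```python
# Minimal sketch of GMM-based language ID: train one Gaussian Mixture
# Model per language on acoustic feature vectors, then pick the language
# whose model gives the highest average log-likelihood for an utterance.
import numpy as np
from sklearn.mixture import GaussianMixture

def train_models(features_by_language, n_components=4):
    """features_by_language: dict mapping language name -> (N, D) array."""
    models = {}
    for lang, feats in features_by_language.items():
        gmm = GaussianMixture(n_components=n_components, covariance_type="diag")
        gmm.fit(feats)
        models[lang] = gmm
    return models

def identify(models, utterance_feats):
    """Return the language whose GMM best explains the utterance."""
    scores = {lang: gmm.score(utterance_feats)  # mean log-likelihood per frame
              for lang, gmm in models.items()}
    return max(scores, key=scores.get)

# Toy usage: random "features" standing in for real MFCC vectors.
rng = np.random.default_rng(0)
data = {"english": rng.normal(0.0, 1.0, (500, 13)),
        "mandarin": rng.normal(1.0, 1.0, (500, 13))}
models = train_models(data)
print(identify(models, rng.normal(1.0, 1.0, (50, 13))))  # likely "mandarin"
```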
Abstract:
The incidence of self-service technology, where consumers deliver the service themselves using technology, is increasing in the service encounter. One area that is under-explored is the potential impact of self-service technology on consumer satisfaction and affective commitment. Accordingly, this paper presents an empirical study that investigates the relative impact of self-service technology on consumer satisfaction (both overall and transaction-specific) and affective commitment, accounting for the moderating effects of consumer characteristics. The results highlight the importance of personal service for evaluations of satisfaction and commitment, and the importance of social competency as a moderator in this relationship. An understanding of these consumer perceptions will allow organisations to develop strategies to deliver the services expected by their consumers, improving consumer satisfaction and commitment.
Abstract:
What is a record producer? There is a degree of mystery and uncertainty about just what goes on behind the studio door. Some producers are seen as Svengali-like figures manipulating artists into mass consumer product. Producers are sometimes seen as mere technicians whose job is simply to set up a few microphones and press the record button. Close examination of the recording process will show how far this is from a complete picture. Artists are special—they come with an inspiration, and a talent, but also with a variety of complications, and in many ways a recording studio can seem the least likely place for creative expression and for an affective performance to happen. The task of the record producer is to engage with these artists and their songs and turn these potentials into form through the technology of the recording studio. The purpose of the exercise is to disseminate this fixed form to an imagined audience—generally in the hope that this audience will prove to be real. Finding an audience is the role of the record company. A record producer must also engage with the commercial expectations of the interests that underwrite a recording. This dissertation considers three fields of interest in the recording process: the performer and the song; the technology of the recording context; and the commercial ambitions of the record company—and positions the record producer as a nexus at the interface of all three. The author reports his structured recollection of five recordings, with three different artists, that all achieved substantial commercial success. The processes are considered from the author’s perspective as the record producer, and from inception of the project to completion of the recorded work. What were the processes of engagement? Do the actions reported conform to the template of nexus? This dissertation proposes that in all recordings the function of producer/nexus is present and necessary—it exists in the interaction of the artistry and the technology. The art of record production is to engage with these artists and the songs they bring and turn these potentials into form.
Abstract:
The paper details the results of the first phase of ongoing research into the sociocultural factors that influence the supervision of higher degree research (HDR) engineering students in the Faculty of Built Environment and Engineering (BEE) and the Faculty of Science and Technology (FaST) at Queensland University of Technology. A quantitative analysis was performed on the results of an online survey administered to 179 engineering students. The study reveals that cultural barriers impact these students' progression and their developing confidence in their research programs. We argue that in order to assist international and non-English speaking background (NESB) research students to triumph over such culturally embedded challenges in engineering research, it is important for supervisors to understand this cohort's unique pedagogical needs and to develop intercultural sensitivity in their pedagogical practice in postgraduate research supervision. To facilitate this, the governing body (Office of Research) can play a vital role, not only in creating the required support structures but also in ensuring their uniform implementation across the board.
Abstract:
"This column is distinguished from previous Impact columns in that it concerns the development tightrope between research and commercial take-up and the role of the LGPL in an open source workflow toolkit produced in a University environment. Many ubiquitous systems have followed this route (Apache, BSD Unix, ...), and the lessons this Service Oriented Architecture produces cast yet more light on how software diffuses out to impact us all." Michiel van Genuchten and Les Hatton
Workflow management systems support the design, execution and analysis of business processes. A workflow management system needs to guarantee that work is conducted at the right time, by the right person or software application, through the execution of a workflow process model. Traditionally, there has been a lack of broad support for a workflow modeling standard. Standardization efforts proposed by the Workflow Management Coalition in the late nineties suffered from limited support for routing constructs. In fact, as later demonstrated by the Workflow Patterns Initiative (www.workflowpatterns.com), a much wider range of constructs is required when modeling realistic workflows in practice. YAWL (Yet Another Workflow Language) is a workflow language that was developed to show that comprehensive support for the workflow patterns is achievable. Soon after its inception in 2002, a prototype system was built to demonstrate that it was possible to have a system support such a complex language. From that initial prototype, YAWL has grown into a fully-fledged, open source workflow management system and support environment.
Abstract:
The paper provides an assessment of the performance of commercial Real Time Kinematic (RTK) systems over longer than recommended inter-station distances. The experiments were set up to test and analyse solutions from the i-MAX, MAX and VRS systems operated with three triangle-shaped network cells, with average inter-station distances of 69 km, 118 km and 166 km. The performance characteristics appraised included initialisation success rate, initialisation time, RTK position accuracy and availability, ambiguity resolution risk and RTK integrity risk, in order to provide a wider perspective on the performance of the systems tested.

The results showed that the performance of all network RTK solutions assessed was affected to similar degrees by the increase in inter-station distance. The MAX solution achieved the highest initialisation success rate, 96.6% on average, albeit with a longer initialisation time. The two VRS approaches achieved a lower initialisation success rate of 80% over the large triangle. In terms of RTK positioning accuracy after successful initialisation, the results indicated good agreement between the actual error growth in both horizontal and vertical components and the accuracy specified by the manufacturers as RMS and part-per-million (ppm) values.

Additionally, the VRS approaches performed better than MAX and i-MAX when tested on the standard triangle network with a mean inter-station distance of 69 km. However, as the inter-station distance increases, the network RTK software may fail to generate VRS corrections and instead revert to operating in the nearest single-base RTK (or RAW) mode. Position uncertainty occasionally exceeded 2 metres, showing that the RTK rover software was using an incorrectly fixed ambiguity solution to estimate the rover position rather than automatically dropping back to an ambiguity-float solution. The results identified that the risk of incorrectly resolving ambiguities reached 18%, 20%, 13% and 25% for i-MAX, MAX, Leica VRS and Trimble VRS respectively when operating over the large triangle network. Additionally, the Coordinate Quality indicator values given by the Leica GX1230 GG rover receiver tended to be over-optimistic and did not function well in identifying incorrectly fixed integer ambiguity solutions. In summary, this independent assessment has identified problems and failures that can occur in all of the systems tested, especially when they are pushed beyond the recommended limits. While such failures are expected, they offer useful insights into where users should be wary and how manufacturers might improve their products. The results also demonstrate that integrity monitoring of RTK solutions is necessary for precision applications, and thus deserves serious attention from researchers and system providers.
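The RMS-plus-ppm specification mentioned above is conventionally read as a fixed error term plus a term that grows linearly with baseline length (1 ppm corresponds to 1 mm of error per km). A minimal sketch of that arithmetic follows; the 10 mm and 1 ppm figures are illustrative assumptions, not values taken from the study or any manufacturer's datasheet.

```python
# Sketch of the usual RTK accuracy budget: a fixed RMS term plus a
# part-per-million term that grows with baseline length.
def expected_error_mm(rms_mm, ppm, baseline_km):
    # 1 ppm equals 1 mm of error per km of baseline length
    return rms_mm + ppm * baseline_km

for d in (69, 118, 166):  # the three mean inter-station distances tested
    print(f"{d} km baseline: ~{expected_error_mm(10, 1, d):.0f} mm expected error")
```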
Abstract:
SAP and its research partners have been developing a language for describing details of services from various viewpoints, called the Unified Service Description Language (USDL). At the time of writing, version 3.0 describes technical implementation aspects of services, as well as stakeholders, pricing, lifecycle, and availability. Work is also underway to address other business and legal aspects of services. This language is designed to be used in service portfolio management, with a repository of service descriptions being available to various stakeholders in an organisation to allow for service prioritisation, development, deployment and lifecycle management. The structure of the USDL metadata is specified using an object-oriented metamodel that conforms to UML, MOF and EMF Ecore. As such, it is amenable to code generation for implementations of repositories that store service description instances. Although Web services toolkits can be used to make these programming language objects available as a set of Web services, the practicalities of writing distributed clients against over one hundred class definitions, containing several hundred attributes, will make for very large WSDL interfaces and highly inefficient "chatty" implementations. This paper gives the high-level design for a completely model-generated repository for any version of USDL (or any other data-only metamodel), which uses the Eclipse Modelling Framework's Java code generation, along with several open source plugins, to create a robust, transactional repository running in a Java application with a relational datastore. However, the repository exposes a generated WSDL interface at a coarse granularity, suitable for distributed client code and user-interface creation. It uses heuristics to drive code generation to bridge between the Web service and EMF granularities.
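The granularity problem driving this design can be sketched in a few lines. All names below are invented and this is not the USDL or EMF API: it simply contrasts a "chatty" fine-grained client, which pays one remote round trip per attribute, with a coarse-grained interface that moves a whole service description in a single call.

```python
# Sketch (invented names) of fine- vs coarse-grained repository access.
class FakeRepository:
    """Stands in for the generated repository; each method call would be
    one Web-service round trip in a distributed deployment."""
    def __init__(self):
        self._store = {1: {"name": "payroll", "price": 100.0, "owner": "HR"}}
        self.round_trips = 0

    def get_attribute(self, sid, attr):       # fine-grained access
        self.round_trips += 1
        return self._store[sid][attr]

    def get_service_description(self, sid):   # coarse-grained access
        self.round_trips += 1
        return dict(self._store[sid])

repo = FakeRepository()
chatty = {a: repo.get_attribute(1, a) for a in ("name", "price", "owner")}
print(repo.round_trips)  # 3 round trips for 3 attributes (hundreds for USDL)
repo.round_trips = 0
coarse = repo.get_service_description(1)
print(repo.round_trips)  # 1 round trip for the whole description
```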
Abstract:
In their quest for resources to support children's early literacy learning and development, parents encounter and traverse different spaces in which discourses and artifacts are produced and circulated. This paper uses conceptual tools from the field of geosemiotics to examine some commercial spaces designed for parents and children which foreground preschool learning and development. Drawing on data generated in a wider study, I discuss some of the ways in which the material and virtual commercial spaces of a transnational shopping mall company and an educational toy company operate as sites of encounter between discourses and artifacts about children's early learning and parents of preschoolers. I consider how companies connect with and 'situate' people as parents and customers, and then offer pathways designed for parents to follow as they attempt to meet their very young children's learning and development needs. I argue that these pathways are both material and ideological, and that they increasingly tend to lead parents to the online commercial spaces of the world wide web. I show how companies are using the online environment, and hybrid offline and online spaces and flows, to reinforce an image of themselves as authoritative brokers of childhood resources for parents, an image that is highly valuable in a policy climate which foregrounds lifelong learning and school readiness.
Abstract:
There has been minimal research focused on short-term study abroad language immersion programs, in particular those involving home-stay families. The importance of authentic intercultural experience is increasingly clear and has been acknowledged as central to the process of language learning (Liddicoat, 2004). In Hong Kong, education programs for pre-service language teachers have significantly emphasised language and intercultural training through short-term study abroad, and in the last decade these short overseas language immersion courses have become a compulsory component of teacher training (Bodycott & Crew, 2001). This study investigates eight Hong Kong pre-service teachers' and their home-stay families' experiences of a short-term (two-month) language immersion program in Australia. The focus is on listening to commentaries concerning the development of communicative competence, intercultural competence and professional growth during the out-of-class study abroad experience. The conceptual framework adopted in this study views language and intercultural learning from social constructivist perspectives. Central to this framing is the notion that the internalisation of higher mental functions involves a transfer from the inter-psychological to the intra-psychological plane, that is, a progression from socially supported to individually controlled performance. From this perspective, language serves as a way to communicate about, and in relation to, actions and experience. Three research questions were addressed and studied through qualitative methodology:
1. How do the pre-service teachers and their home-stay families perceive the out-of-class component of the program in terms of opportunities for the development of language proficiency and communicative competence?
2. How do the pre-service teachers and their home-stay families perceive the out-of-class component of the program in terms of the development of intercultural competence?
3. How do the pre-service teachers and home-stay families perceive the out-of-class component of the program in terms of teachers' professional growth?
Data were generated from multiple data collection methods and analysed thematically, using both "bottom up" and "top down" approaches. The study showed that the pre-service teachers perceived that the immersion program influenced, to varying degrees, their language proficiency, communication and intercultural awareness, as well as their self-awareness and professional growth. These pre-service teachers believed that effective language learning centres on active engagement in the target language community. A mismatch between the views and evaluations of the two groups, the pre-service teachers and the home-stay family members, provides some evidence of misalignment in expectations and perceptions of each other's roles and responsibilities. The study has highlighted challenges encountered and provided suggestions for ways of meeting them. The inclusion in the study of the home-stay families' perceptions and commentaries provided insights which can inform program development. There is clearly further work to be done in terms of pre-departure orientation and preparation, not only for the main participants, the students, but also for the host families.
Abstract:
This paper examines the interactions between knowledge and power in the adoption of technologies central to municipal water supply plans, specifically investigating decisions in Progressive Era Chicago regarding water meters. The invention and introduction into use of the reliable water meter early in the Progressive Era allowed planners and engineers to gauge water use, and enabled communities willing to invest in the new infrastructure to allocate costs for provision of supply to consumers relative to use. In an era where efficiency was so prized and the role of technocratic expertise was increasing, Chicago’s continued failure to adopt metering (despite levels of per capita consumption nearly twice that of comparable cities and acknowledged levels of waste nearing half of system production) may indicate that the underlying characteristics of the city’s political system and its elite stymied the implementation of metering technologies as in Smith’s (1977) comparative study of nineteenth century armories. Perhaps, as with Flyvbjerg’s (1998) study of the city of Aalborg, the powerful know what they want and data will not interfere with their conclusions: if the data point to a solution other than what is desired, then it must be that the data are wrong. Alternatively, perhaps the technocrats failed adequately to communicate their findings in a language which the political elite could understand, with the failure lying in assumptions of scientific or technical literacy rather than with dissatisfaction in outcomes (Benveniste 1972). When examined through a historical institutionalist perspective, the case study of metering adoption lends itself to exploration of larger issues of knowledge and power in the planning process: what governs decisions regarding knowledge acquisition, how knowledge and power interact, whether the potential to improve knowledge leads to changes in action, and, whether the decision to overlook available knowledge has an impact on future decisions.
Abstract:
The indecision surrounding the definition of Technology extends to the classroom, as not knowing what a subject "is" affects how it is taught. Similarly, its relative newness, and consequent lack of habitus in school settings, means that it is still struggling to find its own place in the curriculum as well as to resolve its relationship with more established subject domains, particularly Science and Mathematics. The guidance from syllabus documents points to open-ended, student-directed projects, whereas extant studies indicate a more common experience of teacher-directed activities and an emphasis on product over process. There are issues too for researchers in documenting classroom observations and in analysing teacher practice in new learning environments. This paper presents a framework for defining and mapping classroom practice and for attempting to describe the social practice in the Technology classroom. The framework is a bricolage which draws on contemporary research. More formally, the development of the framework is consonant with the aim of design-based research to develop a flexible, adaptive and generalisable theory to better understand a teaching domain where promise is not seen to match current reality. The framework may also inform emergent approaches to STEM (Science, Technology, Engineering and Mathematics) in education.
Abstract:
Background
The vast sequence divergence among different virus groups has presented a great challenge to alignment-based analysis of virus phylogeny. Due to the problems caused by the uncertainty in alignment, existing tools for phylogenetic analysis based on multiple alignment could not be directly applied to the whole-genome comparison and phylogenomic studies of viruses. There has been a growing interest in alignment-free methods for phylogenetic analysis using complete genome data. Among the alignment-free methods, a dynamical language (DL) method proposed by our group has been applied successfully to the phylogenetic analysis of bacteria and chloroplast genomes.
Results
In this paper, the DL method is used to analyse the whole-proteome phylogeny of 124 large dsDNA viruses and 30 parvoviruses, two data sets with a large difference in genome size. The trees from our analyses are in good agreement with the latest classification of large dsDNA viruses and parvoviruses by the International Committee on Taxonomy of Viruses (ICTV).
Conclusions
The present method provides a new way of recovering the phylogeny of large dsDNA viruses and parvoviruses, and also offers some insights into the affiliation of a number of unclassified viruses. In comparison, some alignment-free methods such as the CV Tree method can be used for recovering the phylogeny of large dsDNA viruses, but they are not suitable for resolving the phylogeny of parvoviruses, which have a much smaller genome size.
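For readers unfamiliar with alignment-free comparison, the following minimal sketch shows the general flavor of such methods using k-mer frequency vectors and cosine distance over a DNA alphabet. It is explicitly not the dynamical language (DL) method of the paper, and the sequences are toy examples.

```python
# Generic alignment-free comparison: represent each sequence as a vector
# of k-mer counts, then measure dissimilarity without any alignment step.
from collections import Counter
from itertools import product
from math import sqrt

def kmer_vector(seq, k=3, alphabet="ACGT"):
    """Count occurrences of every possible k-mer in the sequence."""
    counts = Counter(seq[i:i + k] for i in range(len(seq) - k + 1))
    return [counts["".join(km)] for km in product(alphabet, repeat=k)]

def cosine_distance(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    norm = sqrt(sum(a * a for a in u)) * sqrt(sum(b * b for b in v))
    return 1 - dot / norm

a = kmer_vector("ACGTACGTACGGTACA")
b = kmer_vector("ACGTTCGTACGGTACT")
print(round(cosine_distance(a, b), 3))  # small distance = similar composition
```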
Abstract:
In Australia, trials conducted as 'electronic trials' have ordinarily run with the assistance of commercial service providers, with the associated costs being borne by the parties. However, an innovative approach has been taken by the courts in Queensland. In October 2007 Queensland became the first Australian jurisdiction to develop its own court-provided technology to facilitate the conduct of an electronic trial. This technology was first used in the conduct of civil trials. Its use in the civil sphere highlighted its benefits and, more significantly, demonstrated the potential to achieve much greater efficiencies. The Queensland courts have now gone further, using the court-provided technology in the high profile criminal trial of R v Hargraves, Hargraves and Stoten, in which the three accused were tried for conspiracy to defraud the Commonwealth of Australia of about $3.7 million in tax. This paper explains the technology employed in this case and reports on the perspectives of all of the participants in the process. The representatives for all parties involved in this trial acknowledged, without reservation, that the use of the technology at trial produced considerable overall efficiencies and cost savings. The experience in this trial also demonstrates that the benefits of trial technology for the criminal justice process are greater than those for civil litigation. It shows that, when skilfully employed, trial technology presents opportunities to enhance the fairness of trials for accused persons. The paper urges governments, courts and the judiciary in all jurisdictions to continue their efforts to promote change, and to introduce mechanisms that facilitate a broader shift from the entrenched paper-based approach to both criminal and civil procedure to one which embraces the enormous benefits trial technology has to offer.
Abstract:
When we attempt to speak about the relationship between language, literacy, and the brain, we find ourselves ill equipped to deal with these conceptually and qualitatively different phenomena. Immediately we must straddle different academic traditions that treat each of these as separate “things”. Broadly speaking, the study of language firstly belongs to the domain of biology, then to anthropology, sociology, and linguistics. At its most functional, a study of literacy education is a study of a particular technology, its diffusion techniques, and the abilities and motivations of people to adopt, or adapt themselves to, this technology. The brain is most commonly studied in the field of neurology, which is also a sub-discipline of biology, biochemistry, and medicine.