855 results for Technology in motion pictures
Abstract:
OBJECTIVES: 1. To critically evaluate a variety of mathematical methods of calculating effective population size (Ne) by conducting comprehensive computer simulations and by analysis of empirical data collected from the Moreton Bay population of tiger prawns. 2. To lay the groundwork for the application of the technology in the NPF. 3. To produce software for the calculation of Ne, and to make it widely available.
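For Objective 1, effective population size is often estimated with the temporal (F-statistic) method; the sketch below assumes Nei and Tajima's standardized variance Fc and two samples taken t generations apart, ignoring sampling-plan corrections. Function and variable names are illustrative and not the project's actual software.

```python
def temporal_ne(p0, pt, t, s0, st):
    """Point estimate of effective population size Ne via the temporal
    (F-statistic) method -- an illustrative sketch only.

    p0, pt : allele frequencies at generation 0 and generation t (same order)
    t      : number of generations between the two samples
    s0, st : sample sizes (individuals) at each time point
    """
    # Nei & Tajima's standardized variance of allele-frequency change, Fc
    fc = sum((x - y) ** 2 / ((x + y) / 2 - x * y)
             for x, y in zip(p0, pt)) / len(p0)
    # Subtract the sampling-noise contribution of both samples, then invert
    return t / (2 * (fc - 1 / (2 * s0) - 1 / (2 * st)))
```

With simulated allele frequencies, a drift of 0.1 over 10 generations in samples of 50 individuals yields an Ne estimate of a few hundred, the order of magnitude typical of such studies.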
Abstract:
Permanent hearing loss is a leading global health care burden, with 1 in 10 people affected to a mild or greater degree. A shortage of trained healthcare professionals and associated infrastructure and resource limitations mean that hearing health services are unavailable to the majority of the world's population. Utilizing information and communication technology in hearing health care, or tele-audiology, combined with automation offers unique opportunities for improved clinical care, widespread access to services, and more cost-effective and sustainable hearing health care. Tele-audiology demonstrates significant potential in areas such as education and training of hearing health care professionals, paraprofessionals, parents, and adults with hearing disorders; screening for auditory disorders; diagnosis of hearing loss; and intervention services. Global connectivity is growing rapidly, with increasingly widespread distribution into underserved communities where audiological services may be facilitated through telehealth models. Although many questions related to quality control, licensure, jurisdictional responsibility, certification and reimbursement still need to be addressed, no alternative strategy can currently offer the same potential reach for impacting the global burden of hearing loss in the near and foreseeable future.
Abstract:
The decision in In the Matter of Gray highlights the complications that advancing medical technology causes for the law. The case concerns the removal of semen from a deceased man and illustrates how the courts deal with matters involving medical technology in the absence of specific legislation or established case law, as well as the legal and moral questions the case raises.
Abstract:
The use of electrotransfer for DNA delivery to prokaryotic cells, and to eukaryotic cells in vitro, has been well known and widely used for many years. However, it is only recently that electric fields have been used to enhance DNA transfer to animal cells in vivo, a method known as DNA electrotransfer or in vivo DNA electroporation. This method of somatic cell gene transfer has several advantages: it is a simple method that can be used to transfer almost any DNA construct to animal cells and tissues in vivo; multiple constructs can be co-transfected; it is equally applicable to dividing and nondividing cells; the DNA of interest does not need to be subcloned into a specific viral transfer vector and there is no need for the production of high-titre viral stocks; and, as no viral genes are expressed, there is less chance of an adverse immunologic reaction to vector sequences. The ease with which efficient in vivo gene transfer can be achieved with in vivo DNA electrotransfer is now allowing genetic analysis to be applied to a number of classic animal model systems for which transgenic and embryonic stem cell techniques are not well developed, but for which a wealth of detailed descriptive embryological information is available, or for which surgical manipulation is much more feasible. As well as exciting applications in developmental biology, in vivo DNA electrotransfer is also being used to transfer genes to skeletal muscle and drive expression of therapeutically active proteins, and to examine exogenous gene and protein function in normal adult cells situated within the complex environment of a tissue and organ system in vivo, in effect providing the in vivo equivalent of the in vitro transient transfection assay. As the widespread use of in vivo electroporation has only just begun, the future is likely to hold many more applications for this technology in basic research, biotechnology and clinical research.
Abstract:
In an overview of some of the central issues concerning the impact and effects of new technology in adolescence, this article questions the reality of the net generation before considering the interplay of new and old technologies, the internet as both communication and lifestyle resource, and newer technologies like text messaging and webcams.
Abstract:
Throughout the latter months of 2000 and early 2001, the Australian public, media and parliament were engaged in a long and emotive debate about motherhood. This debate constructed the two main protagonists, the unborn 'child' and the potential mother, with a variety of different and often oppositional identities. The article looks at the way that these subject identities interacted during the debate, starting from the premise that policy making has unintended and unacknowledged material outcomes, and using governmentality as a tool through which to analyse and understand processes of identity manipulation and resistance within policy making. The recent debate concerning the right of lesbian and single women to access new reproductive technologies in Australia is used as a case study. Nominally the debate was about access to IVF technology; in reality, however, the debate was about the governing of women and, in particular, the governing of motherhood identities. The article focuses on the parliamentary debate over the drafting of legislation designed to stop lesbian and single women from accessing these technologies, particularly the utilization of the 'unborn' subject within these debates as a device to discipline the identity of 'mother'.
Abstract:
With the purpose of lowering costs and making the demanded information available to users with no internet access, service companies have adopted automated interaction technologies in their call centers, which may or may not meet users' expectations. Drawing on different areas of knowledge (human-machine interaction, consumer behavior and use of IT), 13 propositions are raised and research is carried out in three parts: a focus group, a field study with users, and interviews with experts. Eleven automated-service characteristics that help explain user satisfaction are listed, a preference model is proposed, and evidence for or against each of the 13 propositions is presented. Using balanced scorecard concepts, a managerial assessment model is proposed for the use of automated call center technology. In future work, the propositions may become verifiable hypotheses through conclusive empirical research.
Abstract:
In this paper we illustrate different perspectives used to create multiple-choice questions and show how these can be improved in the construction of math tests. As is well known, web technologies have a great influence on students' behaviour. Based on an online project begun in 2007 that has been helping students with their individual work, we share our experience and thoughts with colleagues who face the common concern of constructing multiple-choice tests. We feel that multiple-choice tests play an important and very useful supporting role in the self-evaluation or self-examination of our students. Nonetheless, good multiple-choice test items are generally more complex and time-consuming to create than other types of test items: writing them requires a certain amount of skill, though this skill can be increased through study, practice and experience. This paper discusses a number of issues related to the use of multiple-choice questions and lists the advantages and disadvantages of this question format, contrasting it with open questions. Some examples are given in this context.
Abstract:
The purpose of this paper is to analyse whether multiple-choice tests may be considered an interesting alternative for assessing knowledge, particularly in mathematics, as opposed to traditional methods such as open-question exams. In this sense we illustrate some of the opinions of researchers in this area. People often perceive this kind of exam as easy to create, but this is not true: constructing well-written tests is hard work and demands writing ability from teachers. Our proposal is to analyse the difficulties of constructing multiple-choice tests, as well as some advantages and limitations of this type of test. We also review the frequent criticisms and concerns voiced since this objective format first came into use. Finally, in this context, some examples of multiple-choice items in mathematics are given, and we illustrate how to take advantage of and improve this kind of test.
Abstract:
In the smart grid context, distributed generation units based on renewable resources play an important role. Photovoltaic solar units are an evolving technology whose prices have decreased significantly in recent years due to the high penetration of the technology in low-voltage and medium-voltage networks, supported by governmental policies and incentives. This paper proposes a methodology to determine the maximum penetration of photovoltaic units in a distribution network. The paper presents a case study, with four different scenarios, that considers a 32-bus medium-voltage distribution network and the inclusion of storage units.
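The core of such a methodology is to raise PV capacity until a network constraint is violated. A minimal sketch, assuming a caller-supplied `feasible(p)` that runs a power flow and checks voltage and thermal limits (a hypothetical placeholder, not the paper's actual 32-bus study):

```python
def max_pv_penetration(feasible, step=0.05, cap=5.0):
    """Raise PV penetration (as a fraction of feeder load) in small steps
    until the network check fails, and return the last feasible level.

    feasible(p) -- hypothetical callback: runs a power flow at penetration p
                   and returns True while all bus voltages and line loadings
                   stay within limits.
    """
    n = 0  # count steps with integers to avoid float drift
    while (n + 1) * step <= cap and feasible((n + 1) * step):
        n += 1
    return n * step
```

A real study would replace `feasible` with a power-flow solver run per scenario (e.g. with and without storage units) rather than this stepwise stub.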
Abstract:
The constant evolution of the Internet, and its increasing use by, and consequent importance to, private and public organizations, has given rise to an emerging technology: cloud computing. Through cloud computing it is possible to abstract users from the layers below the business, letting them focus only on what matters most to manage, with the advantage of being able to grow (or shrink) resources as needed. The cloud paradigm arises from the need to optimize IT resources in a rapidly expanding technological landscape. In this regard, after a study of the most common cloud platforms and of the current implementation of these technologies at the Institute of Biomedical Sciences of Abel Salazar and the Faculty of Pharmacy of Oporto University, an evolution is proposed in order to address certain requirements in the context of cloud computing.
Abstract:
Virtual Reality (VR) has grown to become state-of-the-art technology in many business- and consumer-oriented e-commerce applications. One of the major design challenges of VR environments is the placement of the rendering process, which converts the abstract description of a scene, as contained in an object database, into an image. This process is usually done on the client side, as in VRML [1], a technology that requires the client's computational power for smooth rendering. The vision of VR is also strongly connected to the issue of Quality of Service (QoS), as the perceived realism depends on an interactive frame rate ranging from 10 to 30 frames per second (fps), real-time feedback mechanisms and realistic image quality. These requirements push traditional home computers, and even highly sophisticated graphical workstations, beyond their limits. Our work therefore introduces an approach for a distributed rendering architecture that gracefully balances the workload between the client and a cluster-based server. We believe that a distributed rendering approach as described in this paper has three major benefits: it reduces the client's workload, it decreases the network traffic, and it allows the re-use of already rendered scenes.
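One simple way to balance rendering work against the 10-30 fps QoS target is a feedback rule: shift scene share to the server when the client misses the target rate and reclaim it when there is headroom. The sketch below is an illustrative heuristic only; the paper's actual balancing policy is not specified in the abstract.

```python
def split_render_load(client_fps, target_fps=25.0, current_server_share=0.0):
    """Return the new fraction of the scene to render on the server cluster.

    Heuristic sketch: offload more when the client misses the target frame
    rate, give work back (cutting network traffic) when it has >20% headroom.
    """
    if client_fps < target_fps:            # client overloaded: offload more
        share = current_server_share + 0.1
    elif client_fps > 1.2 * target_fps:    # plenty of headroom: reclaim work
        share = current_server_share - 0.1
    else:                                  # within band: keep the split
        share = current_server_share
    return min(1.0, max(0.0, share))       # clamp to a valid fraction
```

Run once per measurement interval, this converges on a split where the client just sustains the interactive frame rate.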
Abstract:
Many-core platforms based on a Network-on-Chip (NoC [Benini and De Micheli 2002]) represent an emerging technology in the real-time embedded domain. Although the idea of grouping applications previously executed on separate single-core devices and accommodating them on an individual many-core chip offers various options for power savings and cost reductions, and contributes to overall system flexibility, its implementation is a non-trivial task. In this paper we address the issue of application mapping onto a NoC-based many-core platform, considering the fundamentals and trends of current many-core operating systems; specifically, we elaborate on a limited migrative application model encompassing a message-passing paradigm as a communication primitive. As the main contribution, we formulate the problem of real-time application mapping and propose a three-stage process to solve it efficiently. Analysis assures that the derived solutions guarantee the fulfilment of the posed time constraints regarding worst-case communication latencies, while providing an environment for load balancing for, e.g., thermal, energy, fault-tolerance or performance reasons. We also propose several constraints regarding the topological structure of the application mapping, as well as the inter- and intra-application communication patterns, which efficiently resolve the issues of pessimism and/or intractability in the analysis.
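The flavour of the mapping problem can be shown with a toy greedy placement that puts heavily communicating tasks on nearby cores of a 2D mesh. This is a minimal sketch of the objective (message volume times Manhattan hop distance), not the paper's three-stage method or its worst-case latency analysis.

```python
def greedy_map(tasks, edges, mesh_w, mesh_h):
    """Greedily place tasks onto a mesh_w x mesh_h NoC mesh, minimizing the
    hop distance to already-placed communication partners, weighted by
    message volume.

    tasks -- task ids, ordered (e.g. heaviest communicators first)
    edges -- dict {(a, b): volume} of inter-task traffic
    """
    free = [(x, y) for x in range(mesh_w) for y in range(mesh_h)]
    placed = {}
    for t in tasks:
        def cost(core):
            # weighted Manhattan distance to every placed partner of t
            total = 0
            for (a, b), vol in edges.items():
                other = b if a == t else a if b == t else None
                if other in placed:
                    ox, oy = placed[other]
                    total += vol * (abs(core[0] - ox) + abs(core[1] - oy))
            return total
        best = min(free, key=cost)
        placed[t] = best
        free.remove(best)
    return placed
```

A real-time mapper would additionally reject placements whose worst-case communication latency violates the deadline constraints, which is where the paper's analysis comes in.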
Abstract:
Dissertation presented to the Faculdade de Ciências e Tecnologia of the Universidade Nova de Lisboa for the degree of Master in the Integrated Master's in Chemical and Biochemical Engineering.
Abstract:
Fractional calculus (FC) is currently being applied in many areas of science and technology. This mathematical concept helps researchers gain a deeper insight into several phenomena that integer-order models overlook. Genetic algorithms (GA) are an important tool for solving optimization problems that occur in engineering; this methodology applies the concepts that describe biological evolution to obtain optimal solutions in many different applications. In this line of thought, in this work we use FC and GA concepts to implement the electrical fractional-order potential. The performance of the GA scheme and the convergence of the resulting approximation are analyzed for different numbers of charges and several fractional orders.
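The GA machinery underlying such studies can be sketched minimally: tournament selection, uniform crossover and Gaussian mutation over a real-coded chromosome. The paper's specific encoding of charge positions and fractional orders is not given in the abstract, so the fitness function below is a generic placeholder.

```python
import random

def ga_minimize(fitness, dim, pop_size=40, gens=100,
                bounds=(-10.0, 10.0), seed=0):
    """Minimal real-coded genetic algorithm, returning the best individual
    in the final population. Illustrative sketch only.
    """
    rng = random.Random(seed)
    lo, hi = bounds
    pop = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(gens):
        nxt = []
        for _ in range(pop_size):
            # tournament selection (size 3) of two parents
            p1 = min(rng.sample(pop, 3), key=fitness)
            p2 = min(rng.sample(pop, 3), key=fitness)
            # uniform crossover + Gaussian mutation, clamped to the bounds
            child = [min(hi, max(lo, (a if rng.random() < 0.5 else b)
                                 + rng.gauss(0.0, 0.1)))
                     for a, b in zip(p1, p2)]
            nxt.append(child)
        pop = nxt
    return min(pop, key=fitness)
```

For the fractional-potential problem, `fitness` would measure the error between the GA's charge arrangement and the target fractional-order potential; here any smooth cost (e.g. a sphere function) demonstrates convergence.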