281 results for EXPONENTIALLY EXPANDING MESH
Abstract:
There are emerging movements in several countries to improve policy and practice to protect children from exposure to domestic violence (EDV). These movements have resulted in the collection of new data on EDV and the design and implementation of new child welfare policies and practices. To assist with the development of child welfare practice, this article summarizes current knowledge on the prevalence of EDV, and on child welfare services policies and practices that may hold promise for reducing the frequency and impact of EDV on children. We focus on Australia, Canada, and the United States, as these countries share a similar socio-legal context, a long history of enacting and expanding legislation about reporting of maltreatment, debates regarding the application of reporting laws to EDV, and new child welfare practices that show promise for responding more effectively to EDV.
Abstract:
Genetically modified or engineered foods are produced from rapidly expanding technologies that have sparked international debates and concerns about health and safety. These concerns focus on the potential dangers to human health, the risks of genetic pollution, and the demise of alternative farming techniques, as well as biopiracy and economic exploitation by large private corporations. This article discusses the findings of the world's first Royal Commission on Genetic Modification, conducted in New Zealand, and reveals that there are potential social, ecological and economic risks created by genetically modified foods that require closer criminological scrutiny. As contemporary criminological discourses continue to push new boundaries in areas of crimes of the economy, environmental pollution, risk management, governance and globalization, the potential concerns posed by genetically modified foods create fertile ground for criminological scholarship and activism.
Abstract:
Radial Hele-Shaw flows are treated analytically using conformal mapping techniques. The geometry of interest has a doubly-connected annular region of viscous fluid surrounding an inviscid bubble that is either expanding or contracting due to a pressure difference caused by injection or suction of the inviscid fluid. The zero-surface-tension problem is ill-posed for both bubble expansion and contraction, as both scenarios involve viscous fluid displacing inviscid fluid. Exact solutions are derived by tracking the location of singularities and critical points in the analytic continuation of the mapping function. We show that by treating the critical points, it is easy to observe finite-time blow-up, and the evolution equations may be written in exact form using complex residues. We present solutions that start with cusps on one interface and end with cusps on the other, as well as solutions that have the bubble contracting to a point. For the latter solutions, the bubble approaches an ellipse in shape at extinction.
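As background to the conformal-mapping approach described above (this note is our addition, not part of the paper, which treats the harder doubly-connected annular geometry): in the classical simply-connected radial problem, with $z = f(\zeta,t)$ mapping the unit disk onto the fluid region and $Q$ the injection rate, the zero-surface-tension free boundary evolves according to the Polubarinova–Galin equation

```latex
% Classical simply-connected radial Hele-Shaw evolution (background only;
% the annular case in the paper requires a map of a doubly-connected domain).
\mathrm{Re}\!\left[\,\dot{f}(\zeta,t)\,\overline{\zeta\, f'(\zeta,t)}\,\right]
  = \frac{Q}{2\pi}, \qquad |\zeta| = 1 .
```

Tracking the singularities and critical points of the analytic continuation of $f$ outside the unit disk is what permits exact solutions and the detection of finite-time blow-up (cusp formation) on the interface.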
Abstract:
Blogs and other online platforms for personal writing such as LiveJournal have been of interest to researchers across the social sciences and humanities for a decade now. Although growth in the uptake of blogging has stalled somewhat since the heyday of blogs in the early 2000s, blogging continues to be a major genre of Internet-based communication. Indeed, at the same time that mass participation has moved on to Facebook, Twitter, and other more recent communication phenomena, what has been left behind by the wave of mass adoption is a slightly smaller but all the more solidly established blogosphere of engaged and committed participants. Blogs are now an accepted part of institutional, group, and personal communications strategies (Bruns and Jacobs, 2006); in style and substance, they are situated between the more static information provided by conventional Websites and Webpages and the continuous newsfeeds provided through Facebook and Twitter updates. Blogs provide a vehicle for authors (and their commenters) to think through given topics in the space of a few hundred to a few thousand words – expanding, perhaps, on shorter tweets, and possibly leading to the publication of more fully formed texts elsewhere. They are also a very flexible medium: they readily provide the functionality to include images, audio, video, and other additional materials – as well as the fundamental tool of blogging, the hyperlink itself. Indeed, the role of the link in blogs and blog posts should not be underestimated.
Whatever the genre and topic that individual bloggers engage in, for the most part blogging is used to provide timely updates and commentary – and it is typical for such material to link both to relevant posts made by other bloggers and to previous posts by the present author, both to background material which provides readers with further information about the blogger’s current topic, and to news stories and articles which the blogger found interesting or worthy of critique. Especially where bloggers are part of a larger community of authors sharing similar interests or views (and such communities are often indicated by the presence of yet another type of link – in blogrolls, often in a sidebar on the blog site, which list the blogger’s friends or favourites), the reciprocal writing and linking of posts often constitutes an asynchronous, distributed conversation that unfolds over the course of days, weeks, and months. Research into blogs is therefore interesting for a variety of reasons. For one, a qualitative analysis of one or several blogs can reveal the cognitive and communicative processes through which individual bloggers define their online identity, position themselves in relation to fellow bloggers, frame particular themes, topics and stories, and engage with one another’s points of view. It may also shed light on how such processes may differ across different communities of interest, perhaps in correlation with the different societal framing and valorisation of specific areas of interest, with the socioeconomic backgrounds of individual bloggers, or with other external or internal factors. Such qualitative research now looks back on a decade-long history (for key collections, see Gurak, et al., 2004; Bruns and Jacobs, 2006; also see Walker Rettberg, 2008) and has recently shifted to investigate specifically how blogging practices differ across cultures (Russell and Echchaibi, 2009).
Other studies have also investigated the practices and motivations of bloggers in specific countries from a sociological perspective, through large-scale surveys (e.g. Schmidt, 2009). Blogs have also been directly employed within both K-12 and higher education, across many disciplines, as tools for reflexive learning and discussion (Burgess, 2006).
Abstract:
Volume measurements are useful in many branches of science and medicine. They are usually accomplished by acquiring a sequence of cross sectional images through the object using an appropriate scanning modality, for example x-ray computed tomography (CT), magnetic resonance (MR) or ultrasound (US). In the cases of CT and MR, a dividing cubes algorithm can be used to describe the surface as a triangle mesh. However, such algorithms are not suitable for US data, especially when the image sequence is multiplanar (as it usually is). This problem may be overcome by manually tracing regions of interest (ROIs) on the registered multiplanar images and connecting the points into a triangular mesh. In this paper we describe and evaluate a new discrete form of Gauss’ theorem which enables the calculation of the volume of any enclosed surface described by a triangular mesh. The volume is calculated by summing the vector product of the centroid, area and normal of each surface triangle. The algorithm was tested on computer-generated objects, US-scanned balloons, livers and kidneys and CT-scanned clay rocks. The results, expressed as the mean percentage difference ± one standard deviation, were 1.2 ± 2.3, 5.5 ± 4.7, 3.0 ± 3.2 and −1.2 ± 3.2% for balloons, livers, kidneys and rocks respectively. The results compare favourably with other volume estimation methods such as planimetry and tetrahedral decomposition.
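The per-triangle summation described above can be sketched as follows (a minimal, self-contained illustration of the discrete divergence-theorem idea, not the authors' implementation; the function name and the test mesh are ours):

```python
# Discrete Gauss (divergence) theorem volume of a closed triangle mesh:
# sum, over every surface triangle, the dot product of its centroid with
# its area-weighted outward normal, then divide by three.

def mesh_volume(vertices, triangles):
    """Signed volume of a closed triangle mesh with outward-wound faces."""
    def sub(u, v):
        return (u[0] - v[0], u[1] - v[1], u[2] - v[2])

    def cross(u, v):
        return (u[1] * v[2] - u[2] * v[1],
                u[2] * v[0] - u[0] * v[2],
                u[0] * v[1] - u[1] * v[0])

    total = 0.0
    for i, j, k in triangles:
        a, b, c = vertices[i], vertices[j], vertices[k]
        centroid = tuple((a[n] + b[n] + c[n]) / 3.0 for n in range(3))
        area_normal = cross(sub(b, a), sub(c, a))  # 2 * area * unit normal
        total += sum(p * q for p, q in zip(centroid, area_normal)) * 0.5
    return total / 3.0

# Unit right tetrahedron, faces wound outward; exact volume is 1/6.
verts = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (0, 0, 1)]
tris = [(0, 2, 1), (0, 1, 3), (0, 3, 2), (1, 2, 3)]
print(mesh_volume(verts, tris))  # ≈ 0.1667 (exact: 1/6)
```

Because the result is a signed sum, consistent outward winding of the triangles is what guarantees a positive volume; an inside-out mesh simply yields the negated value.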
Abstract:
Information Technology (IT) and its relationship to organisational performance have long been of interest to researchers. While there is concurrence that IT does contribute to performance, and we are steadily expanding our knowledge of what factors cause better leveraging of IT resources in organisations, we have done little to understand how these factors interact with technology to produce improved performance. Using a structurational lens that recognises the recursive interaction between technology and people in the presence of social practices, and the norms that inform their ongoing practices, we propose an ethnographic approach to understanding the interaction between technology and resources, aiming to provide richer insight into the nature of the environment that promotes better use of IT resources. Such insights could provide IT users with at least an initial conception of the IT usage platform that they could promote in their organisations to leverage the most from their IT resources.
Abstract:
Organisations devote substantial resources to acquiring information technology (IT), and explaining the important issue of how IT can affect performance has posed a significant challenge to information systems (IS) researchers. Owing to the importance of expanding our understanding of how and where IT and IT-related resources impact organisational performance, this study investigates the differential effects of IT resources and IT-related capabilities, in the presence of platform-related complementarities, on business process performance. We test these relationships empirically via a field survey of 216 firms. The findings suggest that IT resources and IT-related capabilities explain variance in performance. Of interest is the finding that the ability of IT resources and IT-related capabilities to explain variance in business process performance is further enhanced by the presence of the platform-related complementarities. Our findings are largely consistent with the resource-based and complementarity arguments of sources of IT-related business value.
Abstract:
A new system is described for estimating volume from a series of multiplanar 2D ultrasound images. Ultrasound images are captured using a personal computer video digitizing card, and an electromagnetic localization system is used to record the pose of the ultrasound images. The accuracy of the system was assessed by scanning four groups of ten cadaveric kidneys on four different ultrasound machines. Scan image planes were oriented either radially, in parallel, or slanted at 30° to the vertical. The cross-sectional images of the kidneys were traced using a mouse and the outline points transformed to 3D space using the Fastrak position and orientation data. Points on adjacent region of interest outlines were connected to form a triangle mesh and the volume of the kidneys estimated using the ellipsoid, planimetry, tetrahedral and ray tracing methods. There was little difference between the results for the different scan techniques or volume estimation algorithms, although, perhaps as expected, the ellipsoid results were the least precise. For radial scanning and ray tracing, the mean and standard deviation of the percentage errors for the four different machines were as follows: Hitachi EUB-240, −3.0 ± 2.7%; Tosbee RM3, −0.1 ± 2.3%; Hitachi EUB-415, 0.2 ± 2.3%; Acuson, 2.7 ± 2.3%.
Abstract:
A century ago, as the Western world embarked on a period of traumatic change, the visual realism of photography and documentary film brought print and radio news to life. The vision that these new media threw into stark relief was one of intense social and political upheaval: the birth of modernity fired and tempered in the crucible of the Great War. As millions died in this fiery chamber and the influenza pandemic that followed, lines of empires staggered to their fall, and new geo-political boundaries were scored in the raw, red flesh of Europe. The decade of 1910 to 1919 also heralded a prolific period of artistic experimentation. It marked the beginning of the social and artistic age of modernity and, with it, the nascent beginnings of a new art form: film. We still live in the shadow of this violent, traumatic and fertile age; haunted by the ghosts of Flanders and Gallipoli and its ripples of innovation and creativity. Something happened here, but to understand how and why is not easy; for the documentary images we carry with us in our collective cultural memory have become what Baudrillard refers to as simulacra. Detached from their referents, they have become referents themselves, to underscore other, grand narratives in television and Hollywood films. The personal histories of the individuals they represent so graphically–and their hope, love and loss–are folded into a national story that serves, like war memorials and national holidays, to buttress social myths and values. And, as filmic images cross-pollinate, with each iteration offering a new catharsis, events that must have been terrifying or wondrous are abstracted. In this paper we first discuss this transformation through reference to theories of documentary and memory–this will form a conceptual framework for a subsequent discussion of the short film Anmer. Produced by the first author in 2010, Anmer is a visual essay on documentary, simulacra and the symbolic narratives of history.
Its form, structure and aesthetic speak of the confluence of documentary, history, memory and dream. Located in the first decade of the twentieth century, its non-linear narratives of personal tragedy and poetic dreamscapes are an evocative reminder of the distance between intimate experience, grand narratives, and the mythologies of popular films. This transformation of documentary sources not only played out in the processes of the film’s production, but also came to form its theme.
Abstract:
EMR (Electronic Medical Record) is an emerging technology that closely blends the non-IT and IT areas. One methodology for linking the two areas is to construct databases. Nowadays, an EMR supports patients before and after treatment and should satisfy all stakeholders, such as practitioners, nurses, researchers, administrators and financial departments. For database maintenance, the DAS (Database as a Service) model is one outsourcing solution. However, there are scalability and strategy issues to address when planning to use the DAS model properly. We constructed three kinds of databases, scaling from 5K to 2560K records: plain-text, MS built-in encryption (an in-house model) and custom AES (Advanced Encryption Standard) under the DAS model. To make custom AES-DAS perform better, we also devised a Bucket Index using a Bloom Filter. The simulation showed that response times increased arithmetically at first but, beyond a certain threshold, increased exponentially. In conclusion, if the database model is close to the in-house model, vendor technology is a good way to obtain consistently fast query response times. If the model is a DAS model, the database is easy to outsource, and techniques such as the Bucket Index enhance its utilisation. Designing the database with field types in mind is also important for faster query response times. This study suggests cloud computing as a next step for the DAS model to satisfy the scalability and security issues.
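The abstract mentions a Bucket Index built with a Bloom Filter but gives no implementation detail; the sketch below is a hypothetical minimal illustration of the general technique only (all class names, filter sizes and the partitioning rule are our assumptions, not the paper's design). The idea is that the outsourced server keeps only bucket ids and Bloom-filter bits, so candidate buckets can be narrowed down without decrypting any records.

```python
# Hypothetical sketch: one Bloom filter per encrypted bucket, used to rule
# out buckets before any decryption. Parameters are illustrative.
import hashlib

class BloomFilter:
    def __init__(self, m=1024, k=3):
        self.m, self.k = m, k   # m bits, k hash functions
        self.bits = 0

    def _positions(self, item):
        # Derive k bit positions from salted SHA-256 digests.
        for i in range(self.k):
            digest = hashlib.sha256(f"{i}:{item}".encode()).hexdigest()
            yield int(digest, 16) % self.m

    def add(self, item):
        for p in self._positions(item):
            self.bits |= 1 << p

    def might_contain(self, item):
        # False positives possible; false negatives are not.
        return all((self.bits >> p) & 1 for p in self._positions(item))

class BucketIndex:
    def __init__(self, n_buckets=16):
        self.n = n_buckets
        self.filters = [BloomFilter() for _ in range(n_buckets)]

    def bucket_of(self, key):
        return hash(key) % self.n   # stand-in partitioning field

    def insert(self, key):
        self.filters[self.bucket_of(key)].add(key)

    def candidate_buckets(self, key):
        # Only these buckets would need to be fetched and decrypted.
        return [b for b in range(self.n)
                if self.filters[b].might_contain(key)]

idx = BucketIndex()
idx.insert("patient-42")
print(idx.bucket_of("patient-42") in idx.candidate_buckets("patient-42"))
```

A Bloom-filter false positive only costs one wasted bucket decryption, while the no-false-negative guarantee means a matching record can never be missed, which is why this structure suits encrypted outsourced storage.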
Abstract:
Electronic Health Record (EHR) retrieval processes are complex and demand Information Technology (IT) resources, particularly memory, at an exponentially growing rate. A Database-as-a-Service (DAS) model is proposed to meet the scalability requirements of EHR retrieval processes. A simulation study applying the DAS model over a range of EHR record volumes is presented. A bucket-indexing model, incorporating partitioning fields and Bloom filters in a Singleton design pattern, was used to implement a custom database encryption system. It provided faster responses for range queries than for the other query types used, such as aggregation queries, across the DAS, built-in-encryption and plain-text DBMS configurations. The study also presents constraints of the approach that should be considered in other practical applications.
Abstract:
Divergence dating studies, which combine temporal data from the fossil record with branch length data from molecular phylogenetic trees, represent a rapidly expanding approach to understanding the history of life. The National Evolutionary Synthesis Center hosted the first Fossil Calibrations Working Group (3–6 March 2011, Durham, NC, USA), bringing together palaeontologists, molecular evolutionists and bioinformatics experts to present perspectives from disciplines that generate, model and use fossil calibration data. Presentations and discussions focused on channels for interdisciplinary collaboration, best practices for justifying, reporting and using fossil calibrations, and roadblocks to synthesis of palaeontological and molecular data. Bioinformatics solutions were proposed, with the primary objective being a new database for vetted fossil calibrations with linkages to existing resources, targeted for a 2012 launch.
Abstract:
The opening phrase of the title is from Charles Darwin’s notebooks (Schweber 1977). It is a double reminder, firstly that mainstream evolutionary theory is not just about describing nature but is particularly looking for mechanisms or ‘causes’, and secondly, that there will usually be several causes affecting any particular outcome. The second part of the title is our concern at the almost universal rejection of the idea that biological mechanisms are sufficient for macroevolutionary changes, thus rejecting a cornerstone of Darwinian evolutionary theory. Our primary aim here is to consider ways of making it easier to develop and to test hypotheses about evolution. Formalizing hypotheses can help generate tests. In an absolute sense, some of the discussion by scientists about evolution is little better than the lack of reasoning used by those advocating intelligent design. Our discussion here is in a Popperian framework where science is defined by that area of study where it is possible, in principle, to find evidence against hypotheses – they are in principle falsifiable. However, with time, the boundaries of science keep expanding. In the past, some aspects of evolution were outside the current boundaries of falsifiable science, but increasingly new techniques and ideas are expanding the boundaries of science and it is appropriate to re-examine some topics. It often appears that over the last few decades there has been an increasingly strong assumption to look first (and only) for a physical cause. This decision is virtually never formally discussed; an assumption is simply made that some physical factor ‘drives’ evolution. It is necessary to examine our assumptions much more carefully: what is meant by physical factors ‘driving’ evolution, or by an ‘explosive radiation’? Our discussion focuses on two of the six mass extinctions, the fifth being events in the Late Cretaceous, and the sixth starting at least 50,000 years ago (and ongoing).
Cretaceous/Tertiary boundary: the rise of birds and mammals. We have had a long-term interest (Cooper and Penny 1997) in designing tests to help evaluate whether the processes of microevolution are sufficient to explain macroevolution. The real challenge is to formulate hypotheses in a testable way. For example, the number of lineages of birds and mammals that survive from the Cretaceous to the present is one test. Our first estimate was 22 for birds, and current work is tending to increase this value. This still does not consider lineages that survived into the Tertiary and then went extinct later. Our initial suggestion was probably too narrow in that it lumped four models from Penny and Phillips (2004) into one model. This reduction is too simplistic in that we need to know about survival and ecological and morphological divergences during the Late Cretaceous, and whether crown groups of avian or mammalian orders may have existed back into the Cretaceous. More recently (Penny and Phillips 2004) we have formalized hypotheses about dinosaurs and pterosaurs, with the prediction that interactions between mammals (and ground-feeding birds) and dinosaurs would be most likely to affect the smallest dinosaurs, and similarly that interactions between birds and pterosaurs would particularly affect the smaller pterosaurs. There is now evidence for both classes of interactions, with the smallest dinosaurs and pterosaurs declining first, as predicted. Thus, testable models are now possible. Mass extinction number six: human impacts. On a broad scale, there is a good correlation between time of human arrival and increased extinctions (Hurles et al. 2003; Martin 2005; Figure 1). However, it is necessary to distinguish different time scales (Penny 2005) and on a finer scale there are still large numbers of possibilities. In Hurles et al.
(2003) we mentioned habitat modification (including the use of fire), introduced plants and animals (including kiore) in addition to direct predation (the ‘overkill’ hypothesis). We need also to consider the prey switching that occurs in early human societies, as evidenced by the results of Wragg (1995) on the middens of different ages on Henderson Island in the Pitcairn group. In addition, the presence of human-wary or human-adapted animals will affect the distribution in the subfossil record. A better understanding of human impacts world-wide, in conjunction with pre-scientific knowledge, will make it easier to discuss the issues by removing ‘blame’. While continued spontaneous generation was accepted universally, there was the expectation that animals continued to reappear. New Zealand is one of the very best locations in the world to study many of these issues. Apart from the marine fossil record, some human impact events are extremely recent and the remains are less disrupted by time.
Abstract:
A traditional approach centred on weekly lectures, perhaps supported by a tutorial programme, still predominates in modern legal education in Australia. This approach tends to focus on the transmission of knowledge about legal rules and doctrine to students who adopt a largely passive role. Criticisms of the traditional approach have led to law schools expanding their curricula to include the teaching of skills, including the skill of negotiation and an appreciation of legal ethics and professional responsibility. However, in a climate of limited government funding for law schools in Australia, innovation in legal education remains a challenge. This paper considers the successful use of Second Life machinima in two programs, Air Gondwana and Entry into Valhalla, and their part in the creation of engaging, effective learning environments. These programs not only engage students in active learning but also facilitate flexibility in their studies, among other benefits. The programs yield important lessons concerning the use of machinima innovations in curricula, not only for academics involved in legal education but also those in other disciplines, especially those that rely on traditional passive lectures in their teaching and learning approaches.
Abstract:
Companies face the challenges of expanding their markets, improving products, services and processes, and exploiting intellectual capital in a dynamic network. Therefore, more companies are turning to an Enterprise System (ES). Knowledge management (KM) has also received considerable attention and is continuously gaining the interest of industry, enterprises, and academia. For ES, KM can provide support across the entire lifecycle, from selection and implementation to use. In addition, it is recognised that an ontology is an appropriate methodology for achieving a common consensus of communication, as well as for supporting a diversity of KM activities, such as knowledge repositories, retrieval, sharing, and dissemination. This paper examines the role of ontology-based KM for ES (OKES) and investigates the possible integration of ontology-based KM and ES. The authors develop a taxonomy as a framework for understanding OKES research. In order to achieve the objective of this study, a systematic review of existing research was conducted. Based on a theoretical framework covering the ES lifecycle, KM, KM for ES, ontology, and ontology-based KM, and guided by this framework, a taxonomy for OKES is established.