370 results for co-presence


Relevance:

20.00%

Publisher:

Abstract:

In this article Caroline Heim explores an avenue for the audience's contribution to the theatrical event that has emerged as increasingly important over the past decade: post-performance discussions. With the exception of theatres that actively encourage argument, such as the Staatstheater Stuttgart, most extant audience discussions in Western mainstream theatres privilege the voice of the theatre expert. Heim presents case studies of post-performance discussions held after performances of Anne of the Thousand Days and Who's Afraid of Virginia Woolf?, which trialled a new model of audience co-creation. An audience text that informs the theatrical event was created, and a new role, that of audience critic, was established in the process.

Relevance:

20.00%

Publisher:

Abstract:

Sequence data often have competing signals, which are detected by network programs or Lento plots. Such data can be formed by generating sequences on more than one tree and combining the results: a mixture model. We report that with such mixture models, the estimates of edge (branch) lengths from maximum likelihood (ML) methods that assume a single tree are biased. Based on the observed number of competing signals in real data, such a bias in ML is expected to occur frequently. Because network methods can recover competing signals more accurately, there is a need for ML methods that allow a network. A fundamental problem is that mixture models can have more parameters than can be recovered from the data, so some mixtures are not, in principle, identifiable. We recommend that network programs be incorporated into best-practice analysis, along with ML and Bayesian trees.
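
The bias is straightforward to demonstrate in a toy setting. The following Python sketch (an illustration of the general phenomenon, not code from the article) uses pairwise Jukes-Cantor distances in place of full tree likelihoods: half the sites are simulated at one true distance and half at another, the data are pooled, and a single distance is estimated by maximum likelihood. Because the JC69 distance is a convex function of the observed proportion of differences, the single-model estimate systematically underestimates the true average distance.

    import numpy as np

    def jc_p(d):
        """Expected proportion of differing sites at JC69 distance d."""
        return 0.75 * (1.0 - np.exp(-4.0 * d / 3.0))

    def jc_ml_distance(p):
        """ML estimate of the JC69 distance from an observed proportion p."""
        return -0.75 * np.log(1.0 - 4.0 * p / 3.0)

    rng = np.random.default_rng(0)
    n = 200_000          # sites per mixture class
    d1, d2 = 0.05, 1.0   # true distances for the two mixture classes

    # simulate identical/different sites for each class, then pool them
    diffs1 = rng.random(n) < jc_p(d1)
    diffs2 = rng.random(n) < jc_p(d2)
    p_pooled = np.concatenate([diffs1, diffs2]).mean()

    print("true mean distance      :", (d1 + d2) / 2)
    print("single-model ML estimate:", jc_ml_distance(p_pooled))
    # the pooled estimate falls well below the true mean, because
    # d(p) is convex: d(mean p) < mean d(p)  (Jensen's inequality)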

Relevance:

20.00%

Publisher:

Abstract:

The opening phrase of the title is from Charles Darwin’s notebooks (Schweber 1977). It is a double reminder: firstly, that mainstream evolutionary theory is not just about describing nature but is particularly looking for mechanisms or ‘causes’; and secondly, that there will usually be several causes affecting any particular outcome. The second part of the title reflects our concern at the almost universal rejection of the idea that biological mechanisms are sufficient for macroevolutionary changes, a rejection of a cornerstone of Darwinian evolutionary theory. Our primary aim here is to consider ways of making it easier to develop and to test hypotheses about evolution. Formalizing hypotheses can help generate tests. In an absolute sense, some of the discussion by scientists about evolution is little better than the lack of reasoning used by those advocating intelligent design. Our discussion is set in a Popperian framework, where science is defined as that area of study in which it is possible, in principle, to find evidence against hypotheses – they are in principle falsifiable. With time, however, the boundaries of science keep expanding. In the past, some aspects of evolution were outside the then-current boundaries of falsifiable science, but new techniques and ideas are increasingly expanding those boundaries, and it is appropriate to re-examine some topics. Over the last few decades there appears to have been an increasingly strong assumption to look first (and only) for a physical cause. This decision is virtually never formally discussed; it is simply assumed that some physical factor ‘drives’ evolution. We need to examine our assumptions much more carefully: what is meant by physical factors ‘driving’ evolution, or by an ‘explosive radiation’? Our discussion focuses on two of the six mass extinctions: the fifth, the events in the Late Cretaceous, and the sixth, which began at least 50,000 years ago and is ongoing.

The Cretaceous/Tertiary boundary: the rise of birds and mammals. We have had a long-term interest (Cooper and Penny 1997) in designing tests to help evaluate whether the processes of microevolution are sufficient to explain macroevolution. The real challenge is to formulate hypotheses in a testable way. For example, the number of lineages of birds and mammals that survive from the Cretaceous to the present is one test. Our first estimate was 22 for birds, and current work is tending to increase this value. This still does not consider lineages that survived into the Tertiary and then went extinct later. Our initial suggestion was probably too narrow in that it lumped four models from Penny and Phillips (2004) into one; this reduction is too simplistic, in that we need to know about survival and about ecological and morphological divergences during the Late Cretaceous, and whether crown groups of avian or mammalian orders may have existed back into the Cretaceous. More recently (Penny and Phillips 2004) we have formalized hypotheses about dinosaurs and pterosaurs, with the prediction that interactions between mammals (and ground-feeding birds) and dinosaurs would be most likely to affect the smallest dinosaurs, and similarly that interactions between birds and pterosaurs would particularly affect the smaller pterosaurs. There is now evidence for both classes of interactions, with the smallest dinosaurs and pterosaurs declining first, as predicted. Thus, testable models are now possible.

Mass extinction number six: human impacts. On a broad scale, there is a good correlation between the time of human arrival and increased extinctions (Hurles et al. 2003; Martin 2005; Figure 1). However, it is necessary to distinguish different time scales (Penny 2005), and on a finer scale there are still large numbers of possibilities. In Hurles et al. (2003) we mentioned habitat modification (including the use of fire) and introduced plants and animals (including kiore), in addition to direct predation (the ‘overkill’ hypothesis). We also need to consider the prey switching that occurs in early human societies, as evidenced by the results of Wragg (1995) on the middens of different ages on Henderson Island in the Pitcairn group. In addition, the presence of human-wary or human-adapted animals will affect the distribution in the subfossil record. A better understanding of human impacts world-wide, in conjunction with pre-scientific knowledge, will make it easier to discuss the issues by removing ‘blame’. While continued spontaneous generation was accepted universally, there was the expectation that animals would continue to reappear. New Zealand is one of the very best locations in the world to study many of these issues: apart from the marine fossil record, some human impact events are extremely recent and the remains less disrupted by time.

Relevance:

20.00%

Publisher:

Abstract:

The deal value of private equity merger and takeover activity has achieved unprecedented growth over the last couple of years, in Australia and globally. Private equity deals are not a new feature of the market; however, such deals have recently been subject to increased academic, professional and policy interest. This study examines the particular features of 15 major deals involving listed company "targets" and provides evidence – based on a comparison with a benchmark sample – to demonstrate the role that private equity plays in the market for corporate control. The objective of this study was to assess the friendliness of private equity bids. Based on the indicia compiled, lower bid premiums, the presence of break fees and the intention to retain senior management distinguish private equity bids compellingly from the comparative sample of bids. Using these several characteristics of "friendliness", the authors show that private equity deals are generally friendly in nature, consistent with industry rhetoric but perhaps inconsistent with the popular belief that private equity bidders are the "barbarians at the gate".

Relevance:

20.00%

Publisher:

Abstract:

Sorghum (Sorghum bicolor (L.) Moench) is the world’s fifth major cereal crop and holds importance as a construction material and a food and fodder source. More recently, its potential as a biofuel source has been noted. Despite its agronomic importance, sorghum production is constrained by both biotic and abiotic factors. These challenges could be addressed by the use of genetic engineering strategies to complement conventional breeding techniques. However, sorghum is one of the most recalcitrant crops for genetic modification, with the lack of an efficient tissue culture system being amongst the chief reasons. Therefore, the aim of this study was to develop an efficient tissue culture system for establishing regenerable embryogenic cell lines, micropropagation and acclimatisation for Sorghum bicolor, and to use this to optimise parameters for genetic transformation via Agrobacterium-mediated transformation and microprojectile bombardment.

Using five different sorghum cultivars, SA281, 296B, SC49, Wray and Rio, numerous parameters were investigated in an attempt to establish an efficient and reproducible tissue culture and transformation system. Using immature embryos (IEs) as explants, regenerable embryogenic cell lines (ECLs) could only be established from cultivars SA281 and 296B. Large amounts of phenolics were produced from IEs of cultivars SC49, Wray and Rio, and these compounds severely hindered callus formation and development. Cultivar SA281 also produced phenolics during regeneration. Attempts to suppress the production of these compounds in cultivars SA281 and SC49 using activated charcoal, PVP, ascorbic acid, citric acid and liquid filter-paper bridge methods were either ineffective or had a detrimental effect on embryogenic callus formation, development and regeneration. Immature embryos sourced during summer were found to be far more responsive in vitro than those sourced during winter. In an attempt to overcome this problem, IEs were sourced from sorghum grown under summer conditions in either a temperature-controlled glasshouse or a growth chamber; however, the performance of these explants was still inferior to that of natural summer-sourced explants. Leaf whorls, mature embryos, shoot tips and leaf primordia were found to be unsuitable as explants for establishing ECLs in sorghum cultivars SA281 and 296B. Using the florets of immature inflorescences (IFs) as explants, however, ECLs were established and regenerated for these cultivars, as well as for cultivar Tx430, using the callus induction medium SCIM and the regeneration medium VWRM. The best in vitro responses, from the largest possible IFs, were obtained using plants at the FL-2 stage (where the last fully opened leaf was two leaves away from the flag leaf). Immature inflorescences could be stored at 25°C for up to three days without affecting their in vitro responses. Compared to IEs, the IFs were more robust in tissue culture and showed responses that were independent of season and growth conditions.

A micropropagation protocol for sorghum was developed in this study. The optimum plant growth regulator (PGR) combination for the micropropagation of in vitro regenerated plantlets was found to be 1.0 mg/L BAP in combination with 0.5 mg/L NAA. With this protocol, cultivars 296B and SA281 produced an average of 57 and 13 off-shoots per plantlet, respectively. The plantlets were successfully acclimatised and developed into phenotypically normal plants that set seed.

A simplified acclimatisation protocol for in vitro regenerated plantlets was also developed. This protocol involved deflasking in vitro plantlets with at least two fully opened healthy leaves and at least three roots longer than 1.5 cm, washing the media from the roots with running tap water, planting in 100 mm pots, and placing the pots in plastic trays covered with a clear plastic bag in a plant growth chamber. After seven days, the corners of the plastic cover were opened, and the bags were completely removed after 10 days. All plantlets were successfully acclimatised regardless of whether 1:1 perlite:potting mix, potting mix, UC mix or vermiculite was used as the potting substrate.

Parameters were optimised for Agrobacterium-mediated transformation (AMT) of cultivars SA281, 296B and Tx430. The optimal conditions were the use of Agrobacterium strain LBA4404 at an inoculum density of 0.5 OD600, heat shock at 43°C for 3 min, use of the surfactant Pluronic F-68 (0.02% w/v) in inoculation media at pH 5.2, and a 3-day co-cultivation period in the dark at 22°C. Using these parameters, high frequencies of transient GFP expression were observed in IEs precultured on callus initiation media for 1-7 days, as well as in four-week-old IE- and IF-derived callus. Cultivar SA281 appeared very sensitive to Agrobacterium, since all tissue turned necrotic within two weeks post-exposure. For cultivar 296B, GFP expression was observed up to 20 days post co-cultivation, but no stably transformed plants were regenerated. Using cultivar Tx430, GFP was expressed for up to 50 days post co-cultivation. Although no stably transformed plants of this cultivar were regenerated, this was most likely due to the use of unsuitable regeneration media.

Parameters were also optimised for transformation by particle bombardment (PB) of cultivars SA281, 296B and Tx430. The optimal conditions were the use of 3-7-day-old IEs and 4-week-old IF callus, a 4-hour pre- and post-bombardment osmoticum treatment, 0.6 µm gold microparticles, a helium pressure of 1500 kPa and a target distance of 15 cm. Using these parameters, transient GFP expression was observed for up to 14, 30 and 50 days for cultivars SA281, 296B and Tx430, respectively. Further, PB resulted in less tissue necrosis than AMT for the respective cultivars. Despite the presence of transient GFP expression, no stably transformed plants were regenerated. The establishment of regenerable ECLs and the optimisation of AMT and PB parameters in this study provide a platform for future efforts to develop an efficient transformation protocol for sorghum. The development of GM sorghum will be an important step towards improving its agronomic properties as well as its exploitation for biofuel production.

Relevance:

20.00%

Publisher:

Abstract:

While governments are engaged in developing social policy responses to address wicked issues such as poverty, homelessness, drug addiction and crime, long-term resolution of these issues through government policy making and state-based programmatic action has remained elusive. Vehicles for joint action and partnership between government and the community sector, such as co-management, have been offered as a way of harnessing the productive capability and innovative capacity of both sectors to resolve these complex problems. However, while the agenda and intent for collaboration and partnership are well advanced, the models for undertaking this joint action are not well understood and have not been fully developed or evaluated. This chapter examines new approaches to resolving the wicked issue of homelessness by applying the lens of co-management to understand the complexities of this issue and its resolution. The chapter analyses an attempt to move away from the traditional bureaucratic structures of welfare departments, operating through single functional ‘silos’, to a new horizontal ‘hub-based’ model of service delivery that seeks to integrate actors across many different service areas and organizations. The chapter explores case studies of co-management in the establishment, development and operation of service hubs to address homelessness. We argue that the response to homelessness needs a ‘wicked solution’ that goes beyond simply providing shelter to those in need. The case of hub models of community sector organizations working across organizational boundaries is evaluated to determine whether this approach can be considered successful co-management of an innovative initiative, and to understand the requirements for developing, improving and extending this model. The role of the third sector in co-managing public services is examined through the in-depth case studies, and the results are presented together with an assessment of how co-management can contribute to service quality and service management in public services.

Relevance:

20.00%

Publisher:

Abstract:

The possibility of a surface inner-sphere electron transfer mechanism leading to the coating of gold, via the surface reduction of gold(I) chloride on metal and semi-metal oxide nanoparticles, was investigated. Silica and zinc oxide nanoparticles are known to have very different surface chemistry, potentially leading to a new class of gold-coated nanoparticles.

Monodisperse silica nanoparticles were synthesised by the well-known Stöber protocol in conjunction with sonication. The nanoparticle size was regulated solely by varying the amount of ammonia solution added. The presence of surface hydroxyl groups was investigated by liquid-state proton NMR, and the resultant nanoparticle size was measured directly by TEM. The synthesised silica nanoparticles were dispersed in acetonitrile (MeCN) and added to a bis(acetonitrile)gold(I) co-ordination complex, [Au(MeCN)2]+, in MeCN. The silica hydroxyl groups were deprotonated in the presence of MeCN, generating a formal negative charge on the siloxy groups. This allowed the [Au(MeCN)2]+ complex to undergo ligand exchange with the silica nanoparticles, forming a surface co-ordination complex with reduction to gold(0) that proceeded by a surface inner-sphere electron transfer mechanism. The residual [Au(MeCN)2]+ complex was allowed to react with water, disproportionating into gold(0) and gold(III), with the gold(0) being added to the reduced gold already bound on the silica surface. The so-formed metallic gold seed surface was found to be suitable for the conventional reduction of gold(III) to gold(0) by ascorbic acid. This process generated a thin and uniform gold coating on the silica nanoparticles.

This process was then modified to produce uniformly gold-coated composite zinc oxide nanoparticles (Au@ZnO NPs) using surface co-ordination chemistry. AuCl dissolved in acetonitrile (MeCN) supplied chloride ions, which were adsorbed onto the ZnO NPs. The co-ordinated gold(I) was reduced on the ZnO surface to gold(0) by the inner-sphere electron transfer mechanism. Addition of water disproportionated the remaining gold(I) to gold(0) and gold(III). Gold(0) bonded to the gold(0) already on the NP surface, while gold(III) was reduced to gold(0) by ascorbic acid (ASC), completing the gold-coating process.

This gold-coating process for Au@ZnO NPs was then modified to incorporate iodide instead of chloride. ZnO NPs were synthesised using sodium oxide, zinc iodide and potassium iodide in refluxing basic ethanol, with iodide controlling the presence of chemisorbed oxygen. These ZnO NPs were treated with gold(I) chloride dissolved in acetonitrile, leaving chloride anions co-ordinated on the ZnO NP surface. This allowed the acetonitrile ligands in the added [Au(MeCN)2]+ complex to exchange at the surface with the adsorbed chloride. Gold(I) was then reduced by the surface inner-sphere electron transfer mechanism. The presence of the reduced gold on the ZnO NPs allowed adsorption of iodide, generating a uniform deposition of gold onto the ZnO NP surface without the use of additional reducing agents or heat.

Relevance:

20.00%

Publisher:

Abstract:

A ground-based tracking camera and a co-aligned slitless spectrograph were used to measure the spectral signature of visible radiation emitted from the Hayabusa capsule as it entered the Earth's atmosphere in June 2010. Good-quality spectra were obtained that showed the presence of radiation from the heat shield of the vehicle and from the shock-heated air in front of the vehicle. An analysis of the black-body nature of the radiation concluded that the peak average temperature of the surface was about (3100±100) K.
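
By way of illustration, a temperature can be recovered from an uncalibrated spectrum by fitting the shape of Planck's law. The Python sketch below (using synthetic data, not the authors' pipeline) fits only the normalised spectral shape, which sidesteps the unknown absolute flux calibration of a slitless instrument.

    import numpy as np
    from scipy.optimize import curve_fit

    H = 6.626e-34    # Planck constant (J s)
    C = 2.998e8      # speed of light (m/s)
    KB = 1.381e-23   # Boltzmann constant (J/K)

    def planck_shape(wl, T):
        """Planck spectral radiance at wavelengths wl (m), normalised to
        its peak, so a relative-flux spectrum can be fitted for T alone."""
        b = (2 * H * C**2 / wl**5) / np.expm1(H * C / (wl * KB * T))
        return b / b.max()

    # synthetic 'observed' visible-band spectrum: 3100 K black body + noise
    rng = np.random.default_rng(1)
    wl = np.linspace(400e-9, 900e-9, 120)
    obs = planck_shape(wl, 3100.0) * (1 + 0.02 * rng.normal(size=wl.size))
    obs /= obs.max()

    (T_fit,), _ = curve_fit(planck_shape, wl, obs, p0=[4000.0])
    print(f"fitted surface temperature: {T_fit:.0f} K")  # close to 3100 K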

Relevance:

20.00%

Publisher:

Abstract:

User needs are a fundamental element of design. If the design process does not properly reflect user needs, the design will be severely compromised. It is therefore worthwhile to investigate how the user, and user needs, are understood in the design process. In this article, three accepted linear process models for web site and interactive media design are reviewed in terms of designer and user participation. The article then proposes a user-evolving collaborative design process built on co-creation activities between designer and user. Co-creation activities across the entire design process structurally and ontologically reposition users, and user needs, centrally, which allows designers to approach user needs holistically by building a partnership with users. Co-creation creates an equal, evolving, participatory process between user and designer directed towards sharing values and knowledge and creating new domains of collective creativity.

Relevance:

20.00%

Publisher:

Abstract:

A composite SaaS (Software as a Service) is software composed of several software components and data components. The composite SaaS placement problem is to determine where each of the components should be deployed in a cloud computing environment such that the performance of the composite SaaS is optimal. From the computational point of view, the composite SaaS placement problem is a large-scale combinatorial optimization problem, and an Iterative Cooperative Co-evolutionary Genetic Algorithm (ICCGA) was previously proposed for it. The ICCGA can find solutions of reasonable quality, but its computation time is noticeably long. Aiming to improve the computation time, we propose an unsynchronized Parallel Cooperative Co-evolutionary Genetic Algorithm (PCCGA) in this paper. Experimental results show that the PCCGA not only has a quicker computation time but also generates better-quality solutions than the ICCGA.
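
To make the cooperative co-evolutionary idea concrete, here is a minimal single-process Python sketch under assumed details: a toy placement objective built from a hypothetical traffic matrix between software and data components, and two subpopulations that each evolve against the other's current champion. The names (traffic, cost, evolve) and all parameters are illustrative; the actual PCCGA evolves the subpopulations in unsynchronized parallel and uses the paper's full composite SaaS cost model.

    import random

    random.seed(0)
    N_SERVERS, N_SOFT, N_DATA = 8, 10, 10
    # hypothetical traffic volume between software component i and data component j
    traffic = [[random.randint(0, 9) for _ in range(N_DATA)] for _ in range(N_SOFT)]

    def cost(soft_place, data_place):
        """Total cross-server traffic for a combined placement (lower is better)."""
        return sum(traffic[i][j]
                   for i in range(N_SOFT) for j in range(N_DATA)
                   if soft_place[i] != data_place[j])

    def evolve(pop, fitness):
        """One GA generation: truncation selection, one-point crossover, point mutation."""
        scored = sorted(pop, key=fitness)
        survivors = scored[: len(pop) // 2]
        children = []
        while len(survivors) + len(children) < len(pop):
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, len(a))
            child = a[:cut] + b[cut:]
            child[random.randrange(len(child))] = random.randrange(N_SERVERS)
            children.append(child)
        return survivors + children  # best individual stays at index 0

    def rand_pop(size, length):
        return [[random.randrange(N_SERVERS) for _ in range(length)]
                for _ in range(size)]

    soft_pop, data_pop = rand_pop(30, N_SOFT), rand_pop(30, N_DATA)
    for _ in range(200):
        # each subpopulation is scored against the other's current champion;
        # in the PCCGA these two steps would run unsynchronized in parallel
        soft_pop = evolve(soft_pop, lambda s: cost(s, data_pop[0]))
        data_pop = evolve(data_pop, lambda d: cost(soft_pop[0], d))

    print("best placement cost:", cost(soft_pop[0], data_pop[0]))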

Relevance:

20.00%

Publisher:

Abstract:

Just as victims were long absent from traditional, national criminal justice, they were invisible in transitional and international criminal justice after World War II. The Nuremberg Trials were dominated by the perpetrators, and documents were mainly used instead of victim testimony. Contemporaries shared the view that transitional justice, in both international and national procedures, should channel revenge by the victims and their families into the more peaceful venues of courts and legal procedures. Since then, victims have gained an ever more important role in transitional, post-conflict and international criminal justice. Non-judicial tribunals, Truth and Reconciliation Commissions, and international criminal courts and tribunals rely on the testimony of victims, and thus provide a prominent role for victims, who often take centre stage in such procedures and trials. International criminal law and the human rights regime have provided victims with several routes to make themselves heard and to fight against impunity. This paper tracks the road from absence to presence, and from invisibility to visibility, of victims during the second half of the last century and the beginning of the present one. It shows in which ways their presence has shaped and changed transitional and international justice, and in particular how their absence or presence is linked to amnesties.

Relevance:

20.00%

Publisher:

Abstract:

This paper considers VECMs for variables exhibiting cointegration and common features in the transitory components. While the presence of cointegration between the permanent components of series reduces the rank of the long-run multiplier matrix, a common feature among the transitory components leads to a rank reduction in the matrix summarizing short-run dynamics. The common feature also implies that there exist linear combinations of the first-differenced variables in a cointegrated VAR that are white noise, and traditional tests focus on testing for this characteristic. An alternative, however, is to test the rank of the short-run dynamics matrix directly. Consequently, we use the literature on testing the rank of a matrix to produce some alternative test statistics, and we show that these are identical to one of the traditional tests. The performance of the different methods is illustrated in a Monte Carlo analysis, which is then used to re-examine an existing empirical study. Finally, the approach is applied to provide a check for the presence of common dynamics in DSGE models.
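
For reference, the setup can be written compactly; this is a sketch in standard serial-correlation common-feature notation, not necessarily the paper's exact formulation. For an n-variable system with one lag in differences:

    % n-variable VECM with one lag in differences
    \Delta y_t = \alpha\beta' y_{t-1} + \Gamma \Delta y_{t-1} + \varepsilon_t ,
    \qquad \mathrm{rank}(\alpha\beta') = r < n , \quad \mathrm{rank}(\Gamma) = s < n .
    % a common-feature vector \delta annihilates both right-hand terms,
    \delta'\alpha = 0 , \qquad \delta'\Gamma = 0 ,
    % so that \delta'\Delta y_t = \delta'\varepsilon_t is white noise.

Traditional tests search for such white-noise combinations of the first differences; testing rank(Γ) directly targets the same reduced-rank structure in the short-run dynamics.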