194 results for PERFECT NASH EQUILIBRIA
Abstract:
It is now widely acknowledged that student mental well-being is a critical factor in the tertiary student learning experience and is important to student learning success. The issue of student mental well-being also has implications for effective student transition out of university and into the world of work. It is therefore vital that intentional strategies are adopted by universities, both within the formal curriculum and outside it, to promote student well-being and to work proactively and preventatively to avoid a decline in student psychological well-being. This paper describes how the Queensland University of Technology Law School is using animation to teach students about the importance of protecting their mental well-being for their learning success. Mayer and Moreno (2002) define an animation as an external representation with three main characteristics: (1) it is a pictorial representation, (2) it depicts apparent movement, and (3) it consists of objects that are artificially created through drawing or some other modelling technique. Research into the effectiveness of animation as a tool for tertiary student learning engagement is a relatively new and growing field of enquiry. Nash argues, for example, that animations provide a “rich, immersive environment [that] encourages action and interactivity, which overcome an often dehumanizing learning management system approach” (Nash, 2009, 25). Nicholas (2008) states that contemporary millennial students in universities today have been immersed in animated multimedia since birth and in fact need multimedia to learn and communicate effectively. However, it has also been established, for example through the work of Lowe (2003, 2004, 2008), that animations can place additional perceptual, attentional, and cognitive demands on students that they are not always equipped to cope with. There are many different genres of animation. The dominant style of animation used in the university learning environment is expository animation. This approach is a useful tool for visualising dynamic processes and is used to support student understanding of subjects and themes that might otherwise be perceived as theoretically difficult and disengaging. It is also a form of animation that can be constructed to avoid any potential negative impact on cognitive load that the animated genre might have. However, expository animation has limitations for engaging students and can present as clinical and static. For this reason, the project applied Kombartzky, Ploetzner, Schlag, and Metz’s (2010) cognitive strategy for effective student learning from expository animation, and developed a hybrid form of animation that takes advantage of the best elements of expository animation techniques along with more engaging short narrative techniques. First, the paper examines the existing literature on the use of animation in tertiary educational contexts. Second, the paper describes how animation was used at QUT Law School to teach students about the issue of mental well-being and its importance to their learning success. Finally, the paper analyses the potential of the use of animation, and of the cognitive strategy and animation approach trialled in the project, as a teaching tool for the promotion of student learning about the importance of mental well-being.
Abstract:
To evaluate the underreporting rate of cause-of-death data in Shandong province during 2012-2013 by the capture-mark-recapture (CMR) method and to provide a basis for health strategy. Methods: All counties were divided into 5 strata according to the death rates of 2012, and 14 counties were selected; 3 towns or streets were then selected in each county, and 10 villages or neighborhood committees were selected in each town (street). The death data collected from the security bureau and civil affairs bureau were compared with the deaths reported to the National Cause of Death Surveillance system, and the underreporting rate was calculated. Results: In the present study, 6 929 death cases were collected, of which 1 556 cases were found to be underreported. The number of deaths estimated by the CMR method was 6 227 (95% CI: 7 593-7 651), and the average underreporting rate was 23.15%. There were significant differences between strata (P<0.01). The underreporting rate in the 0-4 years age group was 56.93%; the male underreporting rate was 22.31% and the female rate was 24.09%, with no significant difference between the sexes (P>0.05). Conclusion: There is obvious underreporting in the cause-of-death surveillance of Shandong province, and the underreporting rates differ among the 5 strata. The underreporting rate is higher in the 0-4 years age group, and the investigation of causes of death for young residents is incomplete in some counties. The quality of cause-of-death surveillance should be improved by increasing the completeness of the reported data and adjusting the mortality rates in the different strata to obtain accurate mortality estimates for Shandong province.
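The underreporting estimate above rests on a two-source capture-mark-recapture calculation. As a minimal illustration (assuming the standard Chapman-corrected Lincoln-Petersen estimator, with made-up counts rather than the study's stratum data), the estimate and the underreporting rate can be computed as follows:

```python
# Minimal sketch of a two-source capture-mark-recapture (Chapman) estimate.
# The counts below are illustrative placeholders, not the study's data.

def chapman_estimate(n_source1: int, n_source2: int, n_both: int) -> float:
    """Chapman-corrected Lincoln-Petersen estimate of total deaths."""
    return (n_source1 + 1) * (n_source2 + 1) / (n_both + 1) - 1

def underreporting_rate(n_reported: int, n_estimated: float) -> float:
    """Share of estimated deaths missing from the surveillance report."""
    return 1.0 - n_reported / n_estimated

if __name__ == "__main__":
    # Hypothetical stratum: surveillance list, external (security/civil affairs) list, overlap.
    reported, external, overlap = 5373, 6100, 5000
    total_est = chapman_estimate(reported, external, overlap)
    print(f"estimated deaths: {total_est:.0f}")
    print(f"underreporting rate: {underreporting_rate(reported, total_est):.2%}")
```

In practice the same calculation would be repeated per stratum before pooling, which is why stratum-level differences in the underreporting rate can be reported.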
Abstract:
The identification of molecular networks at the system level in mammals is accelerated by next-generation mammalian genetics without crossing, which requires both the efficient production of whole-body biallelic knockout (KO) mice in a single generation and high-performance phenotype analyses. Here, we show that the triple targeting of a single gene using the CRISPR/Cas9 system achieves almost perfect KO efficiency (96%–100%). In addition, we developed a respiration-based fully automated noninvasive sleep phenotyping system, the Snappy Sleep Stager (SSS), for high-performance (95.3% accuracy) sleep/wake staging. Using the triple-target CRISPR and SSS in tandem, we reliably obtained sleep/wake phenotypes, even in double-KO mice. By using this system to comprehensively analyze all of the N-methyl-D-aspartate (NMDA) receptor family members, we identified Nr3a as a short-sleeper gene, which was verified by an independent set of triple-target CRISPR. These results demonstrate the application of mammalian reverse genetics without crossing to organism-level systems biology in sleep research.
Abstract:
We followed by X-ray photoelectron spectroscopy (XPS) the time evolution of graphene layers obtained by annealing 3C-SiC(111)/Si(111) crystals at different temperatures. The intensity of the carbon signal provides a quantification of the graphene thickness as a function of the annealing time, which follows a power law with exponent 0.5. We show that a kinetic model, based on a bottom-up growth mechanism, provides a full explanation of the evolution of the graphene thickness as a function of time, allowing us to calculate the effective activation energy of the process and the energy barriers, in excellent agreement with previous theoretical results. Our study provides a complete and exhaustive picture of Si diffusion into the SiC matrix, establishing the conditions for perfect control of graphene growth by Si sublimation.
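For readers who want to reproduce this kind of analysis, the sketch below (with purely illustrative numbers, not the measured XPS data) fits thickness-versus-time curves to a t^0.5 power law and extracts an effective activation energy from an Arrhenius plot of the fitted prefactors:

```python
# Sketch: fit graphene thickness vs. annealing time to d = k * t**0.5 and extract an
# effective activation energy from the temperature dependence of k (Arrhenius plot).
# All numerical values are placeholders, not the paper's measurements.
import numpy as np

k_B = 8.617e-5  # Boltzmann constant, eV/K

def fit_power_law_prefactor(t, d, exponent=0.5):
    """Least-squares prefactor k for d = k * t**exponent."""
    x = t**exponent
    return float(np.dot(x, d) / np.dot(x, x))

def activation_energy(temps_K, ks):
    """Slope of ln(k) vs. 1/T gives -Ea/kB (Arrhenius)."""
    slope, _ = np.polyfit(1.0 / np.asarray(temps_K), np.log(ks), 1)
    return -slope * k_B  # eV

if __name__ == "__main__":
    t = np.array([60.0, 240.0, 540.0, 960.0])     # annealing times, s (illustrative)
    temps = [1450.0, 1500.0, 1550.0]              # annealing temperatures, K (illustrative)
    # Synthetic thickness data built with a 3.0 eV barrier, so the fit should recover ~3 eV.
    thickness = {T: 1e8 * np.exp(-3.0 / (k_B * T)) * np.sqrt(t) for T in temps}
    ks = [fit_power_law_prefactor(t, thickness[T]) for T in temps]
    print(f"effective activation energy ~ {activation_energy(temps, ks):.2f} eV")
```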
Abstract:
There is increased interest in the use of Unmanned Aerial Vehicles (UAVs) for load transportation, from environmental remote sensing to construction and parcel delivery. One of the main challenges is accurate control of the load position and trajectory. This paper presents an assessment of real flight trials for the control of an autonomous multi-rotor with a suspended slung load, using only visual feedback to determine the load position. The method uses an onboard camera and a common visual marker detection algorithm to robustly detect the load location. The load position is calculated by an onboard processor and transmitted over a wireless network to a ground station, which integrates MATLAB/Simulink, the Robot Operating System (ROS) and a Model Predictive Controller (MPC) to control both the load and the UAV. To evaluate the system performance, the position of the load determined by the visual detection system in real flight is compared with data received from a motion tracking system. The multi-rotor position tracking performance is also analyzed by conducting flight trials using perfect load position data and data obtained only from the visual system. Results show very accurate estimation of the load position (approximately 5% offset) using only the visual system, demonstrating that an external motion tracking system is not needed for this task.
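A simple way to quantify the reported agreement is to compute the mean Euclidean error between the vision-based estimate and the motion-capture track, expressed relative to the cable length. The sketch below is a hypothetical illustration of that metric, not the authors' evaluation code:

```python
# Sketch: compare vision-based load position estimates against motion-capture ground
# truth and report the offset as a percentage of the slung-load cable length.
# Array contents and the cable length are illustrative, not flight-trial data.
import numpy as np

def mean_offset_percent(est_xyz: np.ndarray, truth_xyz: np.ndarray, cable_length_m: float) -> float:
    """Mean Euclidean error between estimate and ground truth, relative to cable length."""
    err = np.linalg.norm(est_xyz - truth_xyz, axis=1)
    return 100.0 * err.mean() / cable_length_m

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    truth = rng.normal(size=(500, 3))                        # hypothetical motion-capture track
    est = truth + rng.normal(scale=0.05, size=truth.shape)   # vision estimate with added noise
    print(f"mean offset: {mean_offset_percent(est, truth, cable_length_m=1.0):.1f}%")
```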
Abstract:
This paper provides an empirical estimation of energy efficiency and other proximate factors that explain energy intensity in Australia for the period 1978-2009. The analysis is performed by decomposing the changes in energy intensity into energy efficiency, fuel mix and structural change effects, using sectoral and sub-sectoral data. The results show that the driving forces behind the decrease in energy intensity in Australia are the efficiency effect and the sectoral composition effect, with the former found to be more prominent than the latter. Moreover, the favourable impact of the composition effect has slowed consistently in recent years. A perfect positive association characterizes the relationship between energy intensity and carbon intensity in Australia. The decomposition results indicate that Australia needs to improve energy efficiency further to reduce energy intensity and carbon emissions.
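The abstract does not name the exact index decomposition used; a common choice for this kind of analysis is the additive log-mean Divisia index (LMDI-I), sketched below with two hypothetical sectors to show how the change in aggregate intensity splits exactly into a structure effect and an efficiency effect:

```python
# Sketch of an additive LMDI-I decomposition of aggregate energy intensity into a
# sectoral-composition (structure) effect and an energy-efficiency (intensity) effect.
# LMDI is assumed here for illustration; the two-sector figures are made up.
from math import log

def logmean(a: float, b: float) -> float:
    return a if a == b else (a - b) / (log(a) - log(b))

def lmdi_decompose(shares0, intens0, sharesT, intensT):
    """Return (structure effect, efficiency effect) for the change in aggregate intensity."""
    d_str = d_eff = 0.0
    for s0, i0, sT, iT in zip(shares0, intens0, sharesT, intensT):
        w = logmean(sT * iT, s0 * i0)        # log-mean weight of the sector's contribution
        d_str += w * log(sT / s0)            # change due to sectoral composition
        d_eff += w * log(iT / i0)            # change due to sectoral energy efficiency
    return d_str, d_eff

if __name__ == "__main__":
    # Two hypothetical sectors: output shares and energy intensities (E/Y) at start and end years.
    shares0, intens0 = [0.6, 0.4], [8.0, 3.0]
    sharesT, intensT = [0.5, 0.5], [6.0, 2.5]
    d_str, d_eff = lmdi_decompose(shares0, intens0, sharesT, intensT)
    total = sum(s * i for s, i in zip(sharesT, intensT)) - sum(s * i for s, i in zip(shares0, intens0))
    print(f"total change {total:.2f} = structure {d_str:.2f} + efficiency {d_eff:.2f}")
```

The additive LMDI-I form is exact, so the two effects always sum to the observed change in aggregate intensity, which is what allows the relative prominence of the efficiency and composition effects to be compared directly.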
Abstract:
Non-government actors such as think-tanks are playing an important role in Australian policy work. As governments increasingly outsource policy work previously done by education departments and academics to these new policy actors, more think-tanks have emerged representing a wide range of political views and ideological positions. This paper looks at the emergence of the Grattan Institute as one significant player in Australian education policy, with particular emphasis on Grattan’s report ‘Turning around low-performing schools’. Grattan exemplifies many facets of Barber’s ‘deliverology’, producing reports designed to be easily digested and simply actioned, and to provide reassurance that there is an answer, often by focusing on ‘what works’ recipes. ‘Turning around low-performing schools’ is a perfect example of this deliverology. However, a close analysis of the Report suggests that it contains four major problems which seriously limit its usefulness for schools and policymakers: it ignores data that may be more important in explaining the turn-around of schools, it is overly reliant on NAPLAN data, there are reasons to be suspicious of the evidence assembled, and it falls into a classic logical trap, the post hoc fallacy.
Abstract:
This paper considers the second-best strategy of correcting a wide variety of trade distortions in a small open economy with perfect competition in all markets. Using the tools of duality, we obtain some general properties of the structure and the levels of the optimal tax/subsidy rates. The paper also analyzes the welfare effects of unilateral piecemeal trade policy reforms when some of the quota distortions imposed by foreign countries are unalterable. It is shown that the merits of unilateral trade policy reforms emphasized in the literature crucially depend on the absence of unalterable foreign-imposed quotas.
Abstract:
This paper investigates the optimal choice of foreign aid when trade policies are decided in a non-cooperative fashion. Three alternative scenarios, depending on the timing of the actions and on whether aid is tied, are analyzed. It is shown that, in the case where aid is decided before tariffs, untied aid can lead to the reduction of the recipient's optimal trade tax. When the donor can tie the aid to a reduction in the recipient's tariff, the optimal aid level is always positive and the world can always achieve a Pareto-efficient equilibrium.
Abstract:
This issue marks the beginning of a new editorial cycle. In the seventh volume of the journal the editorial team will continue collating novel scientific and social developments in the broader field of ‘knowledge-based development’ to report to our readers. In this perspective, the first issue of the volume focuses on different dimensions of knowledge-based urban development. As Gabe et al. (2012, p.1179) indicate, “[i]t would be an understatement to suggest that knowledge plays a key role in today’s economy; for much of the developed world, it might be more accurate to assert that knowledge is today’s economy”. Thus, knowledge generation has become a priority for global city administrations, and there is a growing consensus amongst scholars, planners, politicians and industrialists in identifying knowledge-based urban development as a panacea for burgeoning economic problems (Knight, 1995; Kunzmann, 2009; Yigitcanlar, 2010, 2011; Huggins and Strakova, 2012; Lönnqvist et al., 2014). Although knowledge-based urban development is a critical factor for economic success in the era of the global knowledge economy (Pratt, 2000; Sheppard, 2002), it is not solely an economic policy. For many, knowledge-based urban development is a policy that targets building an urban setting that forms perfect climates for business, people, and governance in an environmentally friendly atmosphere (Carrillo, 2006; Ergazakis et al., 2006; Angelidou et al., 2012). Each of these climates corresponds to a dimension or domain of knowledge-based urban development, namely economy, society, space, and governance (Carrillo et al., 2014). Each paper in this issue corresponds to at least one of these domains, or policy areas.
Abstract:
Among the least known compounds in the transition metal dichalcogenide (TMDC) family are the layered triclinic technetium dichalcogenides (TcX2, X = S, Se). In this work, we systematically study the structural, mechanical, electronic, and optical properties of TcS2 and TcSe2 monolayers based on density functional theory (DFT). We find that TcS2 and TcSe2 can be easily exfoliated in monolayer form because their formation and cleavage energies are comparable to those of other experimentally realized TMDC monolayers. Using a hybrid DFT functional, the TcS2 and TcSe2 monolayers are calculated to be indirect semiconductors with band gaps of 1.91 and 1.69 eV, respectively. However, bilayer TcS2 exhibits direct-bandgap character, and both TcS2 and TcSe2 monolayers can be tuned from semiconductor to metal under effective tensile/compressive strains. Calculations of visible light absorption indicate that two-dimensional (2D) TcS2 and TcSe2 generally possess a better capability of harvesting sunlight than single-layer MoS2 and ReSe2, implying their potential as excellent light absorbers. Most interestingly, we find that the TcSe2 monolayer is an excellent photocatalyst for splitting water into hydrogen owing to the perfect fit of its band edge positions with respect to the water reduction and oxidation potentials. Our predictions expand the 2D family of TMDCs, and the remarkable electronic and optical properties of monolayer TcS2 and TcSe2 place them among the most promising 2D TMDCs for future renewable energy applications.
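The photocatalysis criterion invoked here (band edges straddling the water redox potentials) can be expressed as a simple check. The sketch below uses placeholder band-edge values, not the paper's hybrid-functional results:

```python
# Sketch: check whether a monolayer's band edges straddle the water redox potentials,
# the criterion the abstract invokes for photocatalytic water splitting.
# Band-edge numbers below are illustrative placeholders, not the paper's values.

H_PLUS_H2 = -4.44   # water reduction potential vs. vacuum (eV), 0 V vs. NHE at pH 0
O2_H2O = -5.67      # water oxidation potential vs. vacuum (eV), 1.23 V vs. NHE at pH 0

def suitable_photocatalyst(cbm_eV: float, vbm_eV: float) -> bool:
    """True if the CBM lies above the reduction level and the VBM below the oxidation level."""
    return cbm_eV > H_PLUS_H2 and vbm_eV < O2_H2O

if __name__ == "__main__":
    candidates = {
        "hypothetical monolayer A": (-4.1, -5.8),   # (CBM, VBM) vs. vacuum, eV
        "hypothetical monolayer B": (-4.6, -5.2),
    }
    for name, (cbm, vbm) in candidates.items():
        verdict = "straddles" if suitable_photocatalyst(cbm, vbm) else "misses"
        print(f"{name}: band gap {cbm - vbm:.2f} eV, {verdict} the water redox window")
```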
Abstract:
Solving large-scale all-to-all comparison problems using distributed computing is increasingly significant for various applications. Previous efforts to implement distributed all-to-all comparison frameworks have treated the two phases of data distribution and comparison task scheduling separately. This leads to high storage demands as well as poor data locality for the comparison tasks, creating a need to redistribute the data at runtime. Furthermore, most previous methods have been developed for homogeneous computing environments, so their overall performance degrades even further when they are used in heterogeneous distributed systems. To tackle these challenges, this paper presents a data-aware task scheduling approach for solving all-to-all comparison problems in heterogeneous distributed systems. The approach formulates the requirements for data distribution and comparison task scheduling simultaneously as a constrained optimization problem. Metaheuristic data pre-scheduling and dynamic task scheduling strategies are then developed, along with an algorithmic implementation, to solve the problem. The approach provides perfect data locality for all comparison tasks, avoiding rearrangement of data at runtime. It achieves load balancing among heterogeneous computing nodes, thus improving the overall computation time, and it also reduces data storage requirements across the network. The effectiveness of the approach is demonstrated through experimental studies.
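To make the data-locality idea concrete, the toy sketch below greedily places comparison pairs on heterogeneous nodes and derives the data each node must pre-stage so that no input moves at runtime. It only illustrates the objective; it is not the paper's constrained-optimization or metaheuristic method:

```python
# Toy illustration of the idea behind data-aware all-to-all comparison scheduling:
# assign comparison pairs to heterogeneous nodes in proportion to node speed, then
# derive the data each node must pre-stage so that every assigned pair is local.
# This greedy sketch is NOT the paper's constrained-optimization / metaheuristic method.
from itertools import combinations

def schedule_pairs(items, node_speeds):
    """Greedily assign item pairs to nodes, balancing load by node speed."""
    load = {n: 0.0 for n in node_speeds}                 # tasks assigned per node
    assignment = {n: [] for n in node_speeds}
    data_needed = {n: set() for n in node_speeds}        # items to pre-stage per node
    for pair in combinations(items, 2):
        # Pick the node with the smallest speed-normalized load;
        # break ties by how many of the pair's inputs it is still missing.
        node = min(node_speeds, key=lambda n: (load[n] / node_speeds[n],
                                               len(set(pair) - data_needed[n])))
        assignment[node].append(pair)
        data_needed[node].update(pair)
        load[node] += 1.0
    return assignment, data_needed

if __name__ == "__main__":
    items = [f"seq{i}" for i in range(6)]
    nodes = {"fast-node": 2.0, "slow-node": 1.0}         # hypothetical relative speeds
    tasks, data = schedule_pairs(items, nodes)
    for n in nodes:
        print(f"{n}: {len(tasks[n])} tasks, stores {sorted(data[n])}")
```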
Abstract:
Recognized around the world as a powerful beacon for freedom, hope, and opportunity, the Statue of Liberty's light is not just metaphorical: her dramatic illumination is a perfect example of American ingenuity and engineering. Since the statue's installation in New York Harbor in 1886, lighting engineers and designers had struggled to illuminate the 150-foot copper-clad monument in a manner befitting an American icon. It took the thoughtful and creative approach of Howard Brandston, a legend in his own right, to solve this lighting challenge. In 1984, the designer was asked to give the statue a much-needed lighting makeover in preparation for its centennial. In order to avoid the shortcomings of previous attempts, he studied the monument from every angle and in all lighting conditions, discovering that it looked best in the light of dawn. Brandston determined that he would need 'one lamp to mimic the morning sun and one lamp to mimic the morning sky.' Learning that no existing lamps could simulate these conditions, Brandston partnered with General Electric to develop two new metal halide products. With only a short time for R&D, a team of engineers at GE's Nela Park laboratories assembled a 'top secret' testing room dedicated to the Statue of Liberty project. After nearly two years of work to perfect the new lamps, the 'dawn's early light' effect was finally achieved just days before the centennial celebrations took place in 1986. 'It was truly a labor of love,' he recalls.
Abstract:
Diffusion in a composite slab consisting of a large number of layers provides an ideal prototype problem for developing and analysing two-scale modelling approaches for heterogeneous media. Numerous analytical techniques have been proposed for solving the transient diffusion equation in a one-dimensional composite slab consisting of an arbitrary number of layers. Most of these approaches, however, require the solution of a complex transcendental equation arising from a matrix determinant for the eigenvalues that is difficult to solve numerically for a large number of layers. To overcome this issue, in this paper, we present a semi-analytical method based on the Laplace transform and an orthogonal eigenfunction expansion. The proposed approach uses eigenvalues local to each layer that can be obtained either explicitly, or by solving simple transcendental equations. The semi-analytical solution is applicable to both perfect and imperfect contact at the interfaces between adjacent layers and either Dirichlet, Neumann or Robin boundary conditions at the ends of the slab. The solution approach is verified for several test cases and is shown to work well for a large number of layers. The work is concluded with an application to macroscopic modelling where the solution of a fine-scale multilayered medium consisting of two hundred layers is compared against an “up-scaled” variant of the same problem involving only ten layers.
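As a point of reference for the kind of verification described, the sketch below solves a two-layer version of the problem with perfect interface contact by explicit finite differences (illustrative parameters only; this is not the paper's semi-analytical method), the sort of fine-scale benchmark a layered solution can be checked against:

```python
# Sketch: an explicit finite-difference reference solution for diffusion in a two-layer
# slab with perfect contact at the interface and Dirichlet ends. Parameters are illustrative.
import numpy as np

def diffuse_two_layer(D1, D2, L1, L2, n_per_layer=100, t_end=0.1, u_left=1.0, u_right=0.0):
    """March u_t = (D(x) u_x)_x with piecewise-constant D; u and D*u_x continuous at the interface."""
    n = 2 * n_per_layer + 1
    x = np.linspace(0.0, L1 + L2, n)
    dx = x[1] - x[0]
    D = np.where(x < L1, D1, D2)                     # diffusivity per node
    D_face = 2 * D[:-1] * D[1:] / (D[:-1] + D[1:])   # harmonic mean enforces flux continuity
    dt = 0.4 * dx**2 / D.max()                       # explicit stability limit
    u = np.zeros(n)
    u[0], u[-1] = u_left, u_right                    # Dirichlet boundary conditions
    for _ in range(int(t_end / dt)):
        flux = D_face * (u[1:] - u[:-1]) / dx
        u[1:-1] += dt * (flux[1:] - flux[:-1]) / dx
    return x, u

if __name__ == "__main__":
    x, u = diffuse_two_layer(D1=1.0, D2=0.1, L1=0.5, L2=0.5)
    print("concentration at the interface:", round(float(np.interp(0.5, x, u)), 4))
```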