868 results for Didactic-pedagogic requirements
Abstract:
This paper demonstrates that the use of GARCH-type models for the calculation of minimum capital risk requirements (MCRRs) may lead to inaccurate and therefore inefficient capital requirements. We show that this inaccuracy stems from the fact that GARCH models typically overstate the degree of persistence in return volatility. A simple modification to the model is found to improve the accuracy of MCRR estimates in both back-tests and out-of-sample tests. Given that internal risk management models are already in widespread use in some parts of the world (most notably the USA), and will soon be permitted for EC banks and investment firms, we believe that our paper should serve as a valuable caution to risk management practitioners who are using, or intend to use, this popular class of models.
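The persistence issue the abstract raises can be made concrete with a GARCH(1,1) forecast: multi-step variance forecasts revert to the unconditional variance at rate alpha + beta, so an overstated persistence keeps forecast volatility, and hence the capital requirement, elevated over long horizons. A minimal stdlib-only sketch with hypothetical parameter values (not the paper's estimates):

```python
# GARCH(1,1): sigma2[t+1] = omega + alpha * e[t]**2 + beta * sigma2[t]
# k-step forecast: uncond + (alpha + beta)**(k-1) * (sigma2[t+1] - uncond)

def garch_forecast(omega, alpha, beta, e2_t, sigma2_t, horizon):
    """k-step-ahead conditional variance forecasts, k = 1..horizon."""
    persistence = alpha + beta                     # the quantity GARCH tends to overstate
    uncond = omega / (1.0 - persistence)           # unconditional (long-run) variance
    s1 = omega + alpha * e2_t + beta * sigma2_t    # one-step-ahead forecast
    return [uncond + persistence ** (k - 1) * (s1 - uncond)
            for k in range(1, horizon + 1)]

# Hypothetical parameter sets: high persistence (0.98) vs moderate (0.85)
high = garch_forecast(0.02, 0.08, 0.90, e2_t=4.0, sigma2_t=3.0, horizon=10)
mod = garch_forecast(0.30, 0.10, 0.75, e2_t=4.0, sigma2_t=3.0, horizon=10)
```

With persistence 0.98, roughly 83% of the initial variance shock survives at the 10-step horizon, versus about 23% at persistence 0.85, which is why overstated persistence inflates long-horizon risk estimates.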
Abstract:
Businesses need property in order to generate turnover and profits. If real estate owners are to provide properties and related services that are desirable, it is crucial that they understand tenants' requirements and preferences. Changes in the way businesses operate might well lead to an overall reduction in space requirements in all sectors. Faced with reductions in demand, landlords will find themselves in an increasingly competitive marketplace for tenants. Of the array of strategies available, which should landlords employ for maximum effect? This paper examines what United Kingdom tenants want from commercial property (retail, industrial and office). The first part provides an analysis of data from several hundred interviews with occupiers of commercial properties owned by some of the largest UK real estate investment companies. Results are presented for each of the asset classes separately. The second part compares the findings with previous research and discusses the strategic implications for landlords.
Abstract:
Changes to client requirements are inevitable during construction. Industry discourse is concerned with minimizing and controlling changes. However, accounts of the practices involved in making changes are rare. In response to calls for more research into working practices, an ethnographic study of a live hospital project was undertaken to explore how changes are made. A vignette of a meeting investigating proposed changes, drawn from the many observations produced during the fieldwork, illustrates the issues. There was a strong emphasis on using the change management procedures contained within the contract to investigate changes, even when it was known that the change was not required. For the practitioners, this was a way of demonstrating best practice: transparent and accountable decision-making regarding changes. Hence, concerns for following procedures sometimes overshadowed considerations of whether or not a change was required to improve the functionality of the building. However, the procedures acted as boundary objects between the communities of practice involved on the project by coordinating the work of managing changes. These insights suggest how contract procedures both facilitate and impede the making of changes, and can inform policy guidance and contract drafting.
Abstract:
Flexibility of information systems (IS) has been studied as a way to improve adaptation in support of business agility, understood as the set of capabilities needed to compete more effectively and adapt to rapid changes in market conditions (Glossary of business agility terms, 2003). However, most work on IS flexibility has been limited to systems architecture, ignoring the analysis of interoperability as a part of flexibility from the requirements stage. This paper reports a PhD project that proposes an approach to developing IS with flexibility features, considering challenges of flexibility in small and medium enterprises (SMEs) such as the lack of interoperability and the agility of their business. The motivations of this research are the high prices of IS in developing countries and the usefulness of organizational semiotics to support the analysis of requirements for IS (Liu, 2005).
Abstract:
This paper describes an application of Social Network Analysis methods to the identification of knowledge demands in public organisations. Affiliation networks established in a postgraduate programme were analysed. The course was delivered in distance education mode and its students worked at public agencies. Relations established among course participants were mediated through a virtual learning environment based on Moodle. Data available in Moodle may be extracted using knowledge-discovery-in-databases techniques. Potential degrees of closeness existing among different organisations and among researched subjects were assessed. This suggests how organisations could cooperate in knowledge management and how their common interests might be identified. The study points out that closeness among organisations and research topics may be assessed through affiliation networks, which opens up opportunities for applying knowledge management between organisations and creating communities of practice. Concepts of knowledge management and social network analysis provide the theoretical and methodological basis.
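The projection from affiliation data to inter-organisational closeness that the abstract describes can be sketched with toy data (the participants, agencies, and topics below are hypothetical, not from the study):

```python
from collections import defaultdict
from itertools import combinations

# Hypothetical affiliation data: the agency each course participant works at,
# and the research topics they engaged with in the Moodle environment.
agency = {"ana": "Health", "bia": "Health", "caio": "Finance", "davi": "Education"}
topics = {"ana": {"KM"}, "bia": {"KM", "SNA"}, "caio": {"SNA"}, "davi": {"KM"}}

# Project the two-mode (participant x topic) network onto agencies:
# two agencies grow "closer" for every topic their staff share.
orgs_per_topic = defaultdict(set)
for student, ts in topics.items():
    for t in ts:
        orgs_per_topic[t].add(agency[student])

closeness = defaultdict(int)
for t, orgs in orgs_per_topic.items():
    for a, b in combinations(sorted(orgs), 2):
        closeness[(a, b)] += 1          # shared-topic count between agency pairs
```

Agency pairs with nonzero counts are candidates for knowledge-management cooperation, which is the kind of inference the study draws from its affiliation networks.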
Abstract:
This study investigates the effects of a short-term pedagogic intervention on the development of L2 fluency among learners studying English for Academic Purposes (EAP) at a university in the UK. It also examines the interaction between the development of fluency and that of complexity and accuracy. Using a pre-test/post-test design, data were collected over a period of four weeks from learners performing monologic tasks. While the Control Group (CG) focused on developing general speaking and listening skills, the Experimental Group (EG) received awareness-raising activities and fluency strategy training in addition to general speaking and listening practice, i.e. following the syllabus. The data, coded in terms of a range of measures of fluency, accuracy and complexity, were subjected to repeated-measures MANOVA, t-tests and correlations. The results indicate that after the intervention, while some fluency gains were achieved by the CG, the EG produced significantly more fluent language, demonstrating a faster speech and articulation rate, longer runs and higher phonation time ratios. The significant correlations obtained between measures of accuracy and learners' pauses in the CG suggest that pausing opportunities may have been linked to accuracy. The findings have significant implications for L2 pedagogy, highlighting the positive impact of instruction on the development of fluency.
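The fluency measures the abstract names (speech rate, articulation rate, phonation time ratio) are simple temporal ratios over the speech sample; a small sketch with hypothetical pre/post values for one learner:

```python
def fluency_measures(n_syllables, total_time_s, pause_time_s):
    """Temporal fluency measures commonly used in L2 speech research."""
    phonation = total_time_s - pause_time_s             # time actually spent speaking
    return {
        "speech_rate": n_syllables / total_time_s,          # syll/s, pauses included
        "articulation_rate": n_syllables / phonation,       # syll/s, pauses excluded
        "phonation_time_ratio": phonation / total_time_s,   # share of time phonating
    }

# Hypothetical pre/post values for one learner on a 2-minute monologic task
pre = fluency_measures(n_syllables=300, total_time_s=120.0, pause_time_s=40.0)
post = fluency_measures(n_syllables=360, total_time_s=120.0, pause_time_s=25.0)
```

A gain on all three measures at once, as in this invented example, is the pattern the abstract reports for the experimental group.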
Abstract:
Cosmic shear requires high precision measurement of galaxy shapes in the presence of the observational point spread function (PSF) that smears out the image. The PSF must therefore be known for each galaxy to a high accuracy. However, for several reasons, the PSF is usually wavelength dependent; therefore, the differences between the spectral energy distribution of the observed objects introduce further complexity. In this paper, we investigate the effect of the wavelength dependence of the PSF, focusing on instruments in which the PSF size is dominated by the diffraction limit of the telescope and which use broad-band filters for shape measurement. We first calculate biases on cosmological parameter estimation from cosmic shear when the stellar PSF is used uncorrected. Using realistic galaxy and star spectral energy distributions and populations and a simple three-component circular PSF, we find that the colour dependence must be taken into account for the next generation of telescopes. We then consider two different methods for removing the effect: (i) the use of stars of the same colour as the galaxies and (ii) estimation of the galaxy spectral energy distribution using multiple colours and using a telescope model for the PSF. We find that both of these methods correct the effect to levels below the tolerances required for per cent level measurements of dark energy parameters. Comparison of the two methods favours the template-fitting method because its efficiency is less dependent on galaxy redshift than the broad-band colour method and takes full advantage of deeper photometry.
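The underlying effect can be illustrated with the diffraction scaling: PSF angular size goes as lambda / D, so averaging the Airy scale 1.22 lambda / D over a filter band weighted by the object's spectral energy distribution gives different effective PSF sizes for blue and red objects. The numbers below are illustrative, not a real telescope model:

```python
# Diffraction-limited PSF angular size scales as lambda / D; over a broad band
# the effective PSF is weighted by the object's spectral energy distribution
# (SED), so blue and red objects are blurred differently.

RAD_TO_ARCSEC = 206265.0

def effective_psf_arcsec(wavelengths_nm, sed_weights, diameter_m=1.2):
    """SED-weighted Airy scale 1.22 * lambda / D, in arcseconds."""
    total = sum(sed_weights)
    return sum(w * 1.22 * (lam * 1e-9) / diameter_m * RAD_TO_ARCSEC
               for lam, w in zip(wavelengths_nm, sed_weights)) / total

band = [550, 650, 750, 850]     # sample wavelengths across a broad filter (nm)
blue_star = [4, 3, 2, 1]        # SED weights skewed to short wavelengths
red_galaxy = [1, 2, 3, 4]       # SED weights skewed to long wavelengths
```

Correcting galaxy shapes with a stellar PSF of a different effective colour, as the paper quantifies, mixes these two sizes and biases the shear estimate.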
Abstract:
The process of host cell invasion by Trypanosoma cruzi depends on parasite energy. What source of energy is used for that event is not known. To address this and other questions related to T. cruzi energy requirements and cell invasion, we analyzed metacyclic trypomastigote forms of the phylogenetically distant CL and G strains. For both strains, the nutritional stress experienced by cells starved for 24, 36, or 48 h in phosphate-buffered saline reduced the ATP content and the ability of the parasite to invade HeLa cells proportionally to the starvation time. Inhibition of ATP production by treating parasites with rotenone plus antimycin A also diminished the infectivity. Nutrient depletion did not alter the expression of gp82, the surface molecule that mediates CL strain internalization, but increased the expression of gp90, the negative regulator of cell invasion, in the G strain. When L-proline was given to metacyclic forms starved for 36 h, the ATP levels were restored to those of nonstarved controls for both strains. Glucose had no such effect, although this carbohydrate and L-proline were transported in similar fashions. Recovery of infectivity promoted by L-proline treatment of starved parasites was restricted to the CL strain. The profile of restoration of ATP content and gp82-mediated invasion capacity by L-proline treatment of starved Y-strain parasites was similar to that of the CL strain, whereas the Dm28 and Dm30 strains, whose infectivity is downregulated by gp90, behaved like the G strain. L-Proline was also found to increase the ability of the CL strain to traverse a gastric mucin layer, a property important for the establishment of T. cruzi infection by the oral route. Efficient translocation of parasites through gastric mucin toward the target epithelial cells in the stomach mucosa is an essential requirement for subsequent cell invasion. 
By relying on these closely associated ATP-driven processes, the metacyclic trypomastigotes effectively accomplish their internalization.
Abstract:
The aim of task scheduling is to minimize the makespan of applications by exploiting shared resources in the best possible way. Applications have requirements that call for customized execution environments. One way to provide such environments is to use virtualization on demand. This paper presents two schedulers, based on integer linear programming, which schedule virtual machines (VMs) on grid resources and tasks on these VMs. The schedulers differ from previous work in the joint scheduling of tasks and VMs and in considering the impact of the available bandwidth on the quality of the schedule. Experiments show the efficacy of the schedulers in scenarios with different network configurations.
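The paper's ILP formulation is not reproduced in the abstract; the sketch below only illustrates the joint objective, minimizing makespan while charging each used resource a VM-image transfer delay that depends on its bandwidth, by brute force over a toy instance (all values hypothetical):

```python
from itertools import product

# Hypothetical instance: task runtimes, resource speed factors, and a
# VM-deployment delay set by the bandwidth available to each resource.
tasks = {"t1": 10.0, "t2": 6.0, "t3": 4.0}       # base task runtimes (s)
VM_IMAGE_MB = 400.0
resources = {
    "r1": {"speed": 1.0, "bw_mb_s": 100.0},      # fast CPU, slow link
    "r2": {"speed": 0.5, "bw_mb_s": 400.0},      # slow CPU, fast link
}

def makespan(assign):
    """Completion time of the last resource: VM transfer plus its task load."""
    finish = []
    for r, spec in resources.items():
        load = sum(tasks[t] / spec["speed"] for t in assign if assign[t] == r)
        if load > 0:
            deploy = VM_IMAGE_MB / spec["bw_mb_s"]   # VM image transfer first
            finish.append(deploy + load)
    return max(finish)

# Exhaustive search over all task-to-resource assignments
best = min((dict(zip(tasks, choice))
            for choice in product(resources, repeat=len(tasks))),
           key=makespan)
```

Here the optimum places t1 and t3 on the fast-CPU resource despite its slower VM deployment, exactly the kind of trade-off a bandwidth-aware joint scheduler captures.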
Abstract:
Two-dimensional and 3D quantitative structure-activity relationship (QSAR) studies were performed on a series of diarylpyridines that act as cannabinoid receptor ligands, using hologram QSAR (HQSAR) and comparative molecular field analysis (CoMFA) methods. The QSAR models were built using a data set of 52 CB1 ligands that can be used as anti-obesity agents. Significant correlation coefficients (HQSAR: r² = 0.91, q² = 0.78; CoMFA: r² = 0.98, q² = 0.77) were obtained, indicating the potential of these 2D and 3D models for predicting the activity of untested compounds. The models were then used to predict the potency of an external test set, and the predicted (calculated) values are in good agreement with the experimental results. The final QSAR models, along with the information obtained from the 2D contribution maps and 3D contour maps, are useful tools for the design of novel CB1 ligands with improved anti-obesity potency.
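The r² and q² statistics quoted above are the standard fit and leave-one-out cross-validation coefficients; q² penalizes overfitting because each compound is predicted by a model that never saw it. A stdlib-only sketch for a one-descriptor linear model on toy data (not the paper's 52-compound set):

```python
def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return b, my - b * mx

def r2(xs, ys):
    """Squared correlation of fitted vs observed values (fit quality)."""
    b, a = fit_line(xs, ys)
    my = sum(ys) / len(ys)
    ss_res = sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))
    return 1 - ss_res / sum((y - my) ** 2 for y in ys)

def q2_loo(xs, ys):
    """Leave-one-out cross-validated analogue of r^2 (predictivity)."""
    my = sum(ys) / len(ys)
    press = 0.0
    for i in range(len(xs)):
        b, a = fit_line(xs[:i] + xs[i + 1:], ys[:i] + ys[i + 1:])
        press += (ys[i] - (a + b * xs[i])) ** 2
    return 1 - press / sum((y - my) ** 2 for y in ys)

# Toy descriptor/activity data, invented for illustration
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [1.1, 1.9, 3.2, 3.8, 5.1]
```

For any least-squares fit, q² is at most r², which is why QSAR studies report both: a large gap between them signals an overfit model.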
Abstract:
With the rapid advancement of web technology, more and more educational resources, including software applications for teaching/learning methods, are available across the web, which enables learners to access learning materials and use various ways of learning at any time and in any place. Moreover, various web-based teaching/learning approaches have been developed during the last decade to enhance the capability of both educators and learners. In particular, researchers from both computer science and education are working together, collaboratively focusing on the development of pedagogically enabling technologies that are believed to improve the infrastructure of education systems and processes, including curriculum development models, teaching/learning methods, management of educational resources, systematic organization of communication, and dissemination of knowledge and skills required by and adapted to users. Despite this fast development, however, there are still great gaps between learning intentions, organization of supporting resources, management of educational structures, knowledge points to be learned and inter-knowledge-point relationships such as prerequisites, assessment of learning outcomes, and technical and pedagogic approaches.
More concretely, the issues that have been widely addressed in the literature include a) availability and usefulness of resources, b) smooth integration of various resources and their presentation, c) learners' requirements and intended learning outcomes, d) automation of the learning process in terms of its schedule and interaction, and e) customization of resources and agile management of learning services for delivery, as well as necessary human intervention. Considering these problems, and bearing in mind the advanced web technology of which we should make full use, this report addresses the following two aspects of the systematic architecture of learning/teaching systems: 1) learning objects, a semantic description and organization of learning resources using web service models and methods, and 2) learning service discovery and learning-goal matching for educational coordination and learning service planning.
Abstract:
During the development of system requirements, software system specifications are often inconsistent. Inconsistencies may arise for different reasons, for example, when multiple conflicting viewpoints are embodied in the specification, or when the specification itself is at a transient stage of evolution. These inconsistencies cannot always be resolved immediately. As a result, we argue that a formal framework for the analysis of evolving specifications should be able to tolerate inconsistency by allowing reasoning in the presence of inconsistency without trivialisation, and circumvent inconsistency by enabling impact analyses of potential changes to be carried out. This paper shows how clustered belief revision can help in this process. Clustered belief revision allows for the grouping of requirements with similar functionality into clusters and the assignment of priorities between them. By analysing the result of a cluster, an engineer can either choose to rectify problems in the specification or to postpone the changes until more information becomes available.
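Clustered belief revision as described can be caricatured in a few lines: requirements are grouped into prioritised clusters, and a lower-priority cluster whose addition would make the accepted set inconsistent is set aside rather than allowed to trivialise reasoning. The clusters and propositional encoding below are invented for illustration and are far simpler than the paper's framework:

```python
from itertools import product

# Requirements encoded as propositional constraints over two hypothetical
# flags; clusters group related requirements, lower number = higher priority.
clusters = {
    1: [lambda m: m["logging"] or m["audit"]],        # core: some audit trail
    2: [lambda m: not m["logging"]],                  # performance viewpoint
    3: [lambda m: m["logging"] and not m["audit"]],   # legacy viewpoint
}
VARS = ["logging", "audit"]

def consistent(constraints):
    """True if some truth assignment satisfies every constraint."""
    return any(all(c(dict(zip(VARS, vals))) for c in constraints)
               for vals in product([True, False], repeat=len(VARS)))

# Revision: accept clusters in priority order, setting aside any cluster
# whose addition would make the accepted set inconsistent.
accepted, kept = [], []
for prio in sorted(clusters):
    if consistent(accepted + clusters[prio]):
        accepted += clusters[prio]
        kept.append(prio)
```

The set-aside cluster is not deleted: as the abstract notes, an engineer can revisit it once more information becomes available, or use the conflict to decide which requirement to rectify.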