892 results for Towards Seamless Integration of Geoscience Models and Data
Abstract:
Teachers’ emotional competences and well-being are fundamentally important to developing and maintaining positive relationships in the classroom, which can contribute to improving pedagogical action. The literature describes several intervention programmes designed to change teachers’ practices, attitudes, and beliefs, with evidence of significant improvements in personal competences and school success. An intervention with teachers, part of a broader line of research also involving parents and students, was therefore carried out. It consists of a programme promoting personal (well-being and emotional intelligence) and professional (acquisition of differentiated pedagogical strategies) competences over a period of six months, followed by a focus group to assess the contribution of an empowerment programme intended to promote school success. The preliminary action-research study involved 10 teachers of two 7th-year classes with students showing disruptive behaviour, in a school in the central region of Portugal. The teachers, of both genders, are aged between 44 and 52 and belong to several recruitment groups. The main research question was: “To what extent does an intervention programme, intended for training, contribute to developing personal and professional competences in teachers of the 3rd cycle of basic education?” The teachers revealed a rather favourable view of their participation in the programme, considering that it helped them recognise behaviours and practices poorly suited to their work in the classroom with these students (shouting, scolding, etc.). From pretest to posttest, statistically significant differences were found in the assessment and use of their own emotions. Signs of improvement in positive affect and satisfaction with life were also found, though these were only marginally significant.
The preliminary data from this empowerment programme for these educational agents point to the importance of teachers’ awareness of their own pedagogical action, as well as the need to change traditional pedagogical practices that discourage students from learning. The need to establish closer and more systematic contact with students and their families, in order to meet their needs and expectations, was also highlighted.
Abstract:
The uncertainty of a firm’s future has to be modelled and incorporated into company valuation beyond the explicit period of analysis, i.e., in the continuing or terminal value considered within valuation models. However, a multiplicity of factors influence the continuing value of businesses that are not currently considered within valuation models. Ignoring these factors may cause significant errors of judgment, leading models to goodwill or badwill values far from the substantial value of the underlying assets. Consequently, the results provided will differ markedly from market values. So, why not consider alternative models that incorporate the life expectancy of companies, as well as the influence of other attributes of the company, in order to achieve a closer fit between market prices and valuation methods? This study aims to contribute to this area, having as its main objective the analysis of potential determinants of firm value in the long term. Using a sample of 714 listed companies from 15 European countries and panel data for the period between 1992 and 2011, our results show that continuing value cannot be regarded as the present value of a constant or growing perpetuity of a single attribute of the company, but should instead be determined by a set of attributes such as free cash flow, net income, the average life expectancy of the company, investment in R&D, the capabilities and quality of management, liquidity, and financing structure.
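The single-attribute perpetuity formulation that this abstract argues is too restrictive can be sketched in a few lines; the function name and all figures below are hypothetical, not taken from the study.

```python
# Standard continuing-value (terminal-value) calculation as a growing
# perpetuity of a single attribute (free cash flow) -- the formulation the
# study argues is too restrictive. All figures are hypothetical.

def continuing_value(fcf: float, r: float, g: float) -> float:
    """Gordon-growth perpetuity: CV = FCF * (1 + g) / (r - g)."""
    if r <= g:
        raise ValueError("discount rate must exceed growth rate")
    return fcf * (1 + g) / (r - g)

# Example: 100 of free cash flow, a 10% discount rate, 2% perpetual growth.
cv = continuing_value(fcf=100.0, r=0.10, g=0.02)
print(cv)  # approximately 1275
```

The study’s point is that `fcf` and `g` alone cannot carry the valuation: attributes such as firm life expectancy and R&D investment would enter a richer model.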
Abstract:
The key functional operability in the pre-Lisbon PJCCM pillar of the EU is the exchange of intelligence and information amongst the law enforcement bodies of the EU. The twin issues of data protection and data security within what was the EU’s third-pillar legal framework therefore come to the fore. With the Lisbon Treaty reform of the EU, the increased role of the Commission in PJCCM policy areas, and the integration of the PJCCM provisions with what have traditionally been the pillar I activities of Frontex, the opportunity arises for streamlining the data protection and data security provisions of the law enforcement bodies of the post-Lisbon EU. This is recognised by the Commission in their drafting of an amending regulation for Frontex, when they say that they would prefer “to return to the question of personal data in the context of the overall strategy for information exchange to be presented later this year and also taking into account the reflection to be carried out on how to further develop cooperation between agencies in the justice and home affairs field as requested by the Stockholm programme.” The literature published on this topic has, for the most part, focused on the data protection provisions in Pillar I, EC. While the focus of research has recently shifted to the previously Pillar III PJCCM provisions on data protection, a more focused analysis of the interlocking issues of data protection and data security needs to be made in the context of the law enforcement bodies, particularly those which were based in the pre-Lisbon third pillar. This paper will contribute to that debate, arguing that a review of both the data protection and data security provisions post-Lisbon is required, not only to reinforce individual rights, but also to strengthen inter-agency operability in combating cross-border EU crime.
The EC’s provisions on data protection, as enshrined in Directive 95/46/EC, do not apply to the legal frameworks covering developments within the third pillar of the EU. Even Council Framework Decision 2008/977/JHA, which is supposed to cover data protection provisions within PJCCM, expressly states that its provisions do not apply to “Europol, Eurojust, the Schengen Information System (SIS)” or to the Customs Information System (CIS). In addition, the post-Treaty of Prüm provisions covering the sharing of DNA profiles, dactyloscopic data and vehicle registration data pursuant to Council Decision 2008/615/JHA are not covered by the provisions of the 2008 Framework Decision. As stated by Hijmans and Scirocco, the regime is “best defined as a patchwork of data protection regimes”, with “no legal framework which is stable and unequivocal, like Directive 95/46/EC in the First pillar”. Data security issues are also key to the sharing of data in organised crime or counter-terrorism situations. This article will critically analyse the current legal framework for data protection and security within the third pillar of the EU.
Abstract:
Survival models are widely applied in engineering to model time-to-event data, since censored observations are a common issue in this field. Whether parametric or not, such models may not always fit heterogeneous data well. The present study relies on survival data from critical pumps, where traditional parametric regression can be improved to obtain better approximations. Accounting for censored data, and using an empirical method to split the data into two subgroups so that separate models could be fitted, we mixed two distinct distributions following a mixture-model approach. We concluded that this is a good method for fitting data that do not follow a usual parametric distribution, and for obtaining reliable parameters. A constant cumulative hazard rate policy was also used to determine optimum inspection times from the fitted mixture model, which can be an advantage when compared with the current maintenance policies in deciding whether changes should be introduced.
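A minimal sketch of the two-subgroup mixture idea described above, assuming a Weibull form for each component. The split threshold, sample sizes, and failure times are synthetic stand-ins; a real analysis would also carry the censoring indicators into the likelihood rather than fitting complete times only, as this sketch does.

```python
# Split failure times at an empirical threshold, fit a separate Weibull to
# each subgroup, and combine them with the subgroup proportions as mixing
# weights. Data are synthetic; censoring is ignored in this sketch.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Synthetic pump failure times from two regimes (early vs. wear-out failures).
times = np.concatenate([rng.weibull(1.2, 200) * 50, rng.weibull(3.0, 200) * 400])

threshold = 150.0                     # empirical split point (assumed)
lo, hi = times[times <= threshold], times[times > threshold]
w = len(lo) / len(times)              # mixing weight of the first component

# Fit a 2-parameter Weibull (location fixed at 0) to each subgroup.
c1, _, s1 = stats.weibull_min.fit(lo, floc=0)
c2, _, s2 = stats.weibull_min.fit(hi, floc=0)

def mixture_sf(t):
    """Survival function of the fitted two-component Weibull mixture."""
    return (w * stats.weibull_min.sf(t, c1, scale=s1)
            + (1 - w) * stats.weibull_min.sf(t, c2, scale=s2))

print(mixture_sf(100.0))              # probability of surviving past t = 100
```

Inspection times under a constant cumulative hazard policy could then be read off by inverting the mixture’s cumulative hazard, `-log(mixture_sf(t))`.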
Abstract:
2016
Abstract:
The high degree of variability and inconsistency in the use of cash flow studies by property professionals demands improvement in knowledge and processes. Until recently, little research had been undertaken on the use of cash flow studies in property valuations, but the growing acceptance of this approach for major investment valuations has resulted in renewed interest in the topic. Studies on valuation variations identify data accuracy, model consistency and bias as major concerns. In cash flow studies there are practical problems with the input data and the consistency of the models. This study will refer to the recent literature and identify the major factors in model inconsistency and data selection. A detailed case study will be used to examine the effects of changes in structure and inputs. The key variable inputs will be identified and proposals developed to improve the selection process for these key variables. The variables will be selected with the aid of sensitivity studies, and alternative ways of quantifying the key variables will be explained. The paper recommends, with reservations, the use of probability profiles of the variables and the incorporation of these data in simulation exercises. The use of Monte Carlo simulation is demonstrated, and the factors influencing the structure of the probability distributions of the key variables are outlined. This study relates to ongoing research into the functional performance of commercial property within an Australian Cooperative Research Centre.
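An illustrative Monte Carlo cash-flow study in the spirit described above: the key variable inputs (here rent growth and discount rate) are given probability profiles, and the resulting present values are simulated. The distribution shapes and all parameter values are hypothetical, not taken from the paper.

```python
# Monte Carlo simulation of a property cash-flow valuation with probabilistic
# key inputs. All distribution parameters are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(42)
n_sims, years, rent0 = 10_000, 10, 1_000_000.0

growth = rng.normal(0.03, 0.01, (n_sims, years))      # annual rent growth
disc = rng.triangular(0.06, 0.08, 0.11, (n_sims, 1))  # discount rate per run

rents = rent0 * np.cumprod(1 + growth, axis=1)        # simulated cash flows
t = np.arange(1, years + 1)
pv = (rents / (1 + disc) ** t).sum(axis=1)            # present value per run

print(f"mean PV: {pv.mean():,.0f}  5th pct: {np.percentile(pv, 5):,.0f}")
```

The output distribution of `pv`, rather than a single point estimate, is what the probability-profile approach recommends reporting, with sensitivity studies guiding which inputs deserve a full distribution.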
Abstract:
Views on the nature and relevance of science education have changed significantly over recent decades. This has serious implications for the way in which science is taught in secondary schools, particularly with respect to teaching emerging topics such as biotechnology, which have a socio-scientific dimension and also require novel laboratory skills. It is apparent in the current literature that there is a lack of adequate teacher professional development opportunities in biotechnology education and that a significant need exists for researchers to develop a carefully crafted and well-supported professional development design which will positively impact the way in which teachers engage with contemporary science. This study used a retrospective case study methodology to document the recent evolution of modern biotechnology education as part of the changing nature of science education; to examine the adoption and implementation processes for biotechnology education in three secondary schools; and to propose an evidence-based biotechnology professional development model for science educators. Data were gathered from documents, one-on-one interviews and focus group discussions. Analysis of these data has led to the proposal of a biotechnology professional development model which considers all of the key components of science professional development outlined in the literature, as well as the additional components articulated by the educators studied. This research is timely and pertinent to the needs of contemporary science education because it recognises the need for a professional development model in biotechnology education that addresses the content knowledge, practical skills, pedagogical knowledge and curriculum management components.
Abstract:
In daily activities, people use a number of available means to achieve balance, such as the use of the hands and the co-ordination of balance. One approach that explains this relationship between perception and action is the ecological theory, which is based on the work of a) Bernstein (1967), who posed the problem of ‘the degrees of freedom’; b) Gibson (1979), who set out a theory of perception and the way in which information is received from the environment in order for a certain movement to be achieved; c) Newell (1986), who proposed that movement can derive from the interaction of the constraints imposed by the environment and the organism; and d) Kugler, Kelso and Turvey (1982), who showed the way in which ‘the degrees of freedom’ are connected and interact. According to these theories, the development of movement co-ordination can result from the different constraints imposed on the organism-environment system. The close relation between environmental and organismic constraints, as well as their interaction, determines the movement system that will be activated. These constraints, apart from shaping the co-ordination of specific movements, can also be a rate-limiting factor, to a certain degree, in the acquisition and mastering of a new skill. This framework can be an essential tool for the study of catching an object (e.g., a ball). The importance of this study becomes obvious given that the movements involved in catching an object are representative of everyday actions and characteristic of the interaction between perception and action.
Abstract:
PURPOSE: To introduce techniques for deriving a map that relates visual field locations to optic nerve head (ONH) sectors, and to use the techniques to derive a map relating Medmont perimetric data to data from the Heidelberg Retinal Tomograph. METHODS: Spearman correlation coefficients were calculated relating each visual field location (Medmont M700) to rim area and volume measures for 10-degree ONH sectors (HRT III software) for 57 participants: 34 with glaucoma, 18 with suspected glaucoma, and 5 with ocular hypertension. Correlations were constrained to be anatomically plausible with a computational model of the axon growth of retinal ganglion cells (Algorithm GROW). GROW generated a map relating field locations to sectors of the ONH. For each location, the sector with the maximum statistically significant (P < 0.05) correlation coefficient within 40 degrees of the angle predicted by GROW was computed. Before correlation, both functional and structural data were normalized by either normative data or the fellow eye of each participant. RESULTS: The model of axon growth produced a 24-2 map that is qualitatively similar to existing maps derived from empirical data. When GROW was used in conjunction with normative data, 31% of field locations exhibited a statistically significant relationship. This proportion increased to 67% (z-test, z = 4.84; P < 0.001) when both field and rim area data were normalized by the fellow eye. CONCLUSIONS: A computational model of axon growth, together with normalization by the fellow eye, can assist in constructing an anatomically plausible map connecting visual field data and sectoral ONH data.
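The sector-selection step described in METHODS can be sketched for a single field location: correlate its sensitivities with the rim measure of each 10-degree sector, then keep the significant sector with the highest Spearman coefficient within 40 degrees of the model-predicted angle. The arrays, the predicted angle, and the planted association below are synthetic stand-ins, not Medmont or HRT data.

```python
# For one visual-field location, pick the ONH sector with the strongest
# significant Spearman correlation within 40 degrees of a predicted angle.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(1)
n_subjects, n_sectors = 57, 36
field_loc = rng.normal(size=n_subjects)          # one location's sensitivities
rim = rng.normal(size=(n_subjects, n_sectors))   # rim area per 10-degree sector
rim[:, 7] += 2.0 * field_loc                     # plant a true association

predicted_angle = 80.0                           # from the growth model (assumed)
sector_angles = np.arange(n_sectors) * 10.0

best, best_rho = None, -np.inf
for s in range(n_sectors):
    rho, p = spearmanr(field_loc, rim[:, s])
    ang_diff = abs((sector_angles[s] - predicted_angle + 180) % 360 - 180)
    if p < 0.05 and ang_diff <= 40 and rho > best_rho:
        best, best_rho = s, rho

print(best)  # sector chosen for this field location
```

The angular window is what makes the resulting map anatomically plausible: a spuriously strong correlation on the far side of the disc is never selected.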
Abstract:
The measurement of submicrometre (< 1.0 µm) and ultrafine (diameter < 0.1 µm) particle number concentrations has attracted attention over the last decade because the potential health impacts associated with exposure to these particles can be more significant than those due to exposure to larger particles. At present, ultrafine particles are not regularly monitored and are yet to be incorporated into air quality monitoring programs. As a result, very few studies have analysed long-term and spatial variations in ultrafine particle concentration, and none have done so in Australia. To address this gap in scientific knowledge, the aim of this research was to investigate the long-term trends and seasonal variations in particle number concentrations in Brisbane, Australia. Data collected over a five-year period were analysed using weighted regression models. Monthly mean concentrations in the morning (6:00-10:00) and the afternoon (16:00-19:00) were plotted against time in months, using the monthly variance as the weights. During the five-year period, submicrometre and ultrafine particle concentrations increased in the morning by 105.7% and 81.5% respectively, whereas in the afternoon there was no significant trend. The morning concentrations were associated with fresh traffic emissions and the afternoon concentrations with the background. The statistical tests applied to the seasonal models, on the other hand, indicated that there was no seasonal component. The spatial variation in size distribution across a large urban area was investigated using particle number size distribution data collected at nine different locations during different campaigns. The size distributions were represented by their modal structures and cumulative size distributions. Particle number peaked at around 30 nm, except at an isolated site dominated by diesel trucks, where it peaked at around 60 nm.
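The trend analysis described above can be sketched as a weighted least-squares fit of monthly means against time. The data are synthetic, and the weights here are taken as inverse monthly variances (the usual WLS choice); the study's exact weighting scheme may differ.

```python
# Weighted least-squares trend on synthetic monthly mean concentrations,
# weighting each month by the inverse of its variance.
import numpy as np

rng = np.random.default_rng(7)
months = np.arange(60)                                   # five years
true_trend = 0.02
y = 1.0 + true_trend * months + rng.normal(0, 0.1, 60)   # monthly means
var = rng.uniform(0.05, 0.2, 60)                         # monthly variances
w = 1.0 / var                                            # WLS weights

# Solve the weighted normal equations for intercept and slope.
X = np.column_stack([np.ones_like(months, dtype=float), months])
W = np.diag(w)
beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
print(f"estimated slope per month: {beta[1]:.4f}")
```

Separate morning and afternoon fits of this form would reproduce the study's design, with the slope's significance deciding whether a trend is reported.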
It was found that ultrafine particles contributed 82%-90% of the total particle number. At the sites dominated by petrol vehicles, nanoparticles (< 50 nm) contributed 60%-70% of the total particle number, and at the site dominated by diesel trucks they contributed 50%. Although the sampling campaigns took place during different seasons and were of varying duration, these variations did not have an effect on the particle size distributions. The results suggested that the distributions were instead affected by differences in traffic composition and distance to the road. To investigate the occurrence of nucleation events, that is, secondary particle formation from gaseous precursors, particle size distribution data collected over a 13-month period during 5 different campaigns were analysed. The study area was a complex urban environment influenced by anthropogenic and natural sources. The study introduced a new application of time series differencing for the identification of nucleation events. To evaluate the conditions favourable to nucleation, the meteorological conditions and gaseous concentrations prior to and during nucleation events were recorded. Gaseous concentrations did not exhibit a clear pattern of change. It was also found that nucleation was associated with sea breezes and long-range transport. The implication of this finding is that whilst vehicles are the most important source of ultrafine particles, sea breezes and aged gaseous emissions play a more important role in secondary particle formation in the study area.
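The differencing idea mentioned above can be illustrated with a toy series: a nucleation burst appears as a run of large positive first differences in the particle-number time series, which a simple threshold can flag. The series, burst magnitude, and threshold are all hypothetical, not the study's method in detail.

```python
# Flag a synthetic nucleation burst via first differences of the
# particle-number time series. All values are illustrative.
import numpy as np

rng = np.random.default_rng(3)
n = np.full(200, 8_000.0) + rng.normal(0, 200, 200)   # background (cm^-3)
n[120:135] += np.linspace(0, 60_000, 15)              # nucleation burst

diff = np.diff(n)                                     # first differences
threshold = 1_500.0                                   # assumed cutoff (cm^-3)
events = np.flatnonzero(diff > threshold)             # indices of large jumps
print(events)
```

In practice the threshold would be tuned against visually identified events, and the flagged periods cross-checked against the meteorological and gaseous records, as the study describes.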