355 results for CUNY-wide IT steering committee
Abstract:
The challenges facing the Singapore education system in the new millennium are unique and unprecedented in Asia. Demands for new skills, knowledges, and flexible competencies for globalised economies and cosmopolitan cultures will require system-wide innovation and reform. But there is a dearth of international benchmarks and prototypes for such reforms. This paper describes the current Core Research Program underway at the National Institute of Education in Singapore, a multilevel analysis of Singaporean schooling, pedagogy, youth and educational outcomes. It describes student background, performance, classroom practices, student artefacts and outcomes, and student longitudinal life pathways. The case is made that a systematic focus on teachers' and students' work in everyday classroom contexts is the necessary starting point for pedagogical innovation and change. This, it is argued, can constitute a rich multidisciplinary evidence base for educational policy. (Contains 1 figure, 1 table and 3 notes.)
Abstract:
This thesis addresses the problem of detecting and describing the same scene points in different wide-angle images taken by the same camera at different viewpoints. This is a core competency of many vision-based localisation tasks, including visual odometry and visual place recognition. Wide-angle cameras have a large field of view that can exceed a full hemisphere, and the images they produce contain severe radial distortion. When compared to traditional narrow field of view perspective cameras, more accurate estimates of camera egomotion can be found using the images obtained with wide-angle cameras. The ability to accurately estimate camera egomotion is a fundamental primitive of visual odometry, and this is one of the reasons for the increased popularity of wide-angle cameras for this task. Their large field of view also enables them to capture images of the same regions in a scene taken at very different viewpoints, and this makes them suited for visual place recognition. However, the ability to estimate the camera egomotion and recognise the same scene in two different images depends on the ability to reliably detect and describe the same scene points, or ‘keypoints’, in the images. Most algorithms used for this purpose are designed almost exclusively for perspective images. Applying algorithms designed for perspective images directly to wide-angle images is problematic, as no account is made for the image distortion. The primary contribution of this thesis is the development of two novel keypoint detectors, and a method of keypoint description, designed for wide-angle images. Both reformulate the Scale-Invariant Feature Transform (SIFT) as an image processing operation on the sphere. As the image captured by any central projection wide-angle camera can be mapped to the sphere, applying these variants to an image on the sphere enables keypoints to be detected in a manner that is invariant to image distortion. Each of the variants is required to find the scale-space representation of an image on the sphere, and they differ in the approaches they use to do this. Extensive experiments using real and synthetically generated wide-angle images validate the two new keypoint detectors and the method of keypoint description. The better of the two new keypoint detectors is applied to vision-based localisation tasks including visual odometry and visual place recognition using outdoor wide-angle image sequences. As part of this work, the effect of keypoint coordinate selection on the accuracy of egomotion estimates using the Direct Linear Transform (DLT) is investigated, and a simple weighting scheme is proposed which attempts to account for the uncertainty of keypoint positions during detection. A word reliability metric is also developed for use within a visual ‘bag of words’ approach to place recognition.
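To make the spherical reformulation concrete, the sketch below back-projects a wide-angle image point onto the unit sphere, assuming an equidistant fisheye model; the camera model, function name and parameters are illustrative assumptions rather than the thesis' actual formulation.

```python
# Illustrative sketch (assumed equidistant fisheye model, not the thesis' exact
# camera model): map a wide-angle image point to the unit sphere, the domain on
# which the spherical SIFT variants operate.
import numpy as np

def pixel_to_sphere(u: float, v: float, cx: float, cy: float, f: float) -> np.ndarray:
    """Back-project a fisheye pixel (u, v) to a unit vector on the viewing sphere.

    Assumes an equidistant projection r = f * theta, where r is the radial
    distance from the principal point (cx, cy) and theta is the angle from
    the optical axis.
    """
    dx, dy = u - cx, v - cy
    r = np.hypot(dx, dy)
    theta = r / f                      # angle from the optical axis
    phi = np.arctan2(dy, dx)           # azimuth in the image plane
    return np.array([np.sin(theta) * np.cos(phi),
                     np.sin(theta) * np.sin(phi),
                     np.cos(theta)])

# Example: a point far from the image centre still maps to a valid viewing
# direction, which is what makes processing on the sphere distortion-invariant.
print(pixel_to_sphere(900.0, 620.0, cx=512.0, cy=512.0, f=300.0))
```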
Abstract:
BACKGROUND: The standard treatment for a non-union of the hallux metatarsophalangeal joint fusion has been to revise the fusion. Revision fusion is technically more demanding, often involving bone grafting, more substantial fixation and a prolonged period of immobilization postoperatively. We present data to suggest that removal of hardware and debridement alone is an alternative treatment option. ---------- MATERIALS AND METHODS: A case note review identified patients with a symptomatic non-union after hallux metatarsophalangeal joint (MTPJ) fusion. It is our practice to offer these patients revision fusion or removal of hardware and debridement. For the seven patients who chose hardware removal and were left with a pseudarthrosis, a matched control group was selected from patients who had had successful fusions. Three outcome scores were used. Hallux valgus and dorsiflexion angles were recorded. ---------- RESULTS: One hundred thirty-nine hallux MTPJ arthrodeses were carried out. Fourteen non-unions were identified. The rate of non-union in males and following previous hallux MTPJ surgery was 19% and 24%, respectively. In females undergoing a primary MTPJ fusion, the rate was 2.4%. Twelve non-union patients were reviewed at a mean of 27 months. Eleven patients had elected to undergo removal of hardware and debridement. Four patients with pseudarthrosis were unhappy with the results and proceeded to either revision fusion or MTPJ replacement. The seven non-union patients who had removal of hardware alone had outcome scores marginally worse than those of patients with successful fusions. ---------- CONCLUSION: Removal of hardware alone is a reasonable option to offer as a relatively minor procedure following a failed arthrodesis of the first MTPJ. This must be accepted on the proviso that in this study four out of 11 (36%) patients proceeded to a revision first MTPJ fusion or first MTPJ replacement. We also found that the rate of non-union in primary first MTPJ fusion was significantly higher in males and in those patients who had undergone previous surgery.
Abstract:
The Government of the Hong Kong SAR sponsored a report investigating the Hong Kong construction industry and published the investigating committee's findings in 2001 (HK CIRC 2001). Since then the Provisional Construction Industry Coordination Board (PCICB) and its successor, the Construction Industry Council (CIC), also set up by the Government, have made progress with the necessary reforms. Now that seven years have passed, it is time for an independent evaluation of the impact of the CIRC initiative in order to assist the CIC and Government decision-makers in refining the efforts to improve the industry's performance. This paper reports on the interim results of a study that seeks to provide such an evaluation.
Abstract:
Many of the costs associated with greenfield residential development are apparent and tangible. For example, regulatory fees, government taxes, acquisition costs, selling fees, commissions and others are all relatively easily identified, since they represent actual costs incurred at a given point in time. By contrast, holding costs are not always immediately evident, since they characteristically lack visibility. One reason for this is that, for the most part, they accrue over time in an ever-changing environment. In addition, wide variations exist in development pipeline components: pipelines typically span anywhere from two to more than sixteen years, even within the same geographical region. Determining the starting and end points for holding cost computation can also prove problematic. Furthermore, the choice between applying prevailing inflation rates, interest rates, or a combination of both over time adds further complexity. Although research is emerging in these areas, a review of the literature reveals that attempts to identify holding cost components are limited. Their quantification (in terms of relative weight or proportionate cost to a development project) is even less apparent; in fact, the computation and methodology behind the calculation of holding costs varies widely and is in some instances ignored entirely. In addition, it may be demonstrated that ambiguities exist in terms of the inclusion of various elements of holding costs and the assessment of their relative contribution. Yet their impact on housing affordability is widely acknowledged to be profound, and their quantification could maximise the opportunities for delivering affordable housing. This paper seeks to build on earlier investigations into those elements related to holding costs, providing theoretical modelling of the size of their impact - specifically on the end user. At this point the research relies on quantitative data sets; however, additional qualitative analysis (not included here) will be needed to account for certain variations between developers' expectations and the actual outcomes they achieve. Although this research stops short of cross-referencing with a regional or international comparison study, it yields an improved understanding of the relationship between holding costs, regulatory charges, and housing affordability.
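As a rough illustration of why the holding period matters, the sketch below compounds a single carrying rate over the pipeline durations noted above; the rate, land cost and function are hypothetical, and a real assessment would need to blend inflation and interest effects as discussed.

```python
# Minimal sketch of one way a holding cost can be computed (illustrative
# assumptions only: a single upfront land cost carried at a constant annual
# rate, which may stand in for an interest rate, an inflation rate, or a blend).
def holding_cost(land_cost: float, annual_rate: float, years: float) -> float:
    """Cost of carrying `land_cost` for `years` at `annual_rate`, compounded annually."""
    return land_cost * ((1 + annual_rate) ** years - 1)

# Example: the same parcel held for 2 versus 16 years (the span of development
# pipelines noted above) at an assumed 7% carrying rate.
parcel = 400_000  # hypothetical land cost
for years in (2, 16):
    print(f"{years:>2} years: holding cost ≈ ${holding_cost(parcel, 0.07, years):,.0f}")
```

Even under these simplified assumptions, the carried cost over a sixteen-year pipeline is more than an order of magnitude larger than over a two-year pipeline, which is the affordability effect the paper models.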
Abstract:
Using data from 2004 to 2008, we find that an audit committee is an important monitoring mechanism as audit committee independence, expertise and size are associated with reduced levels of abnormal accruals, our measure of earnings management. This study also attempts to discern when the monitoring role of the audit committee is more salient for the firm. We find that ownership concentration and the presence of government officials on the audit committee are important determinants of the negative association between audit committee characteristics and earnings management. In contrast, we find no significant associations between the audit committee and abnormal accruals for Chinese firms listed only on the Chinese domestic Stock Exchanges. The paper contributes to the corporate governance literature in a transitional economy. Identifying the role of audit committees of firms listed on markets other than the domicile market demonstrates the importance of considering the institutional setting in governance research.
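The kind of cross-sectional regression that typically underlies such findings can be sketched as follows; the variable names, simulated data and specification are hypothetical and are not the authors' exact model.

```python
# Sketch of a cross-sectional regression of abnormal accruals on audit committee
# characteristics and controls (hypothetical variables and simulated data only).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 300
df = pd.DataFrame({
    "ac_independence": rng.uniform(0.3, 1.0, n),    # share of independent members
    "ac_expertise": rng.integers(0, 2, n),          # financial expert present (0/1)
    "ac_size": rng.integers(3, 7, n),               # number of committee members
    "own_concentration": rng.uniform(0.1, 0.7, n),  # largest shareholder's stake
    "firm_size": rng.normal(8, 1, n),               # log of total assets (control)
})
# Simulated dependent variable: absolute abnormal accruals, lower when the
# audit committee is more independent, expert and larger.
df["abs_abnormal_accruals"] = (
    0.08 - 0.03 * df.ac_independence - 0.01 * df.ac_expertise
    - 0.005 * df.ac_size + rng.normal(0, 0.02, n)
).clip(lower=0)

model = smf.ols(
    "abs_abnormal_accruals ~ ac_independence + ac_expertise + ac_size"
    " + own_concentration + firm_size", data=df
).fit(cov_type="HC1")  # heteroskedasticity-robust standard errors
print(model.summary().tables[1])
```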
Abstract:
Driver simulators provide safe conditions for assessing driver behaviour and offer controlled, repeatable environments for study. They are a promising research tool in terms of both safety and experimental control. Driver simulators range widely, from laptop-based set-ups to advanced systems in which a real car, mounted on a platform with six degrees of freedom of movement, is controlled by several computers. The applicability of simulator-based research to a particular study needs to be considered before the study begins, to determine whether the use of a simulator is actually appropriate for the research. Given the wide range of driver simulators and their uses, it is important to know beforehand how closely the results from a driver simulator match results found in the real world. Comparing drivers’ performance under real road conditions with their performance in a particular simulator is a fundamental part of validation. The important question is whether the results obtained in a simulator mirror real world results. In this paper, the results of the most recently conducted research into the validity of simulators are presented.
Abstract:
This paper attempts to determine whether the adoption of recommended corporate governance practices by Chinese firms is associated with less earnings management proxied by abnormal accruals. We examine the role of the audit committee and ownership concentration in preventing earnings management using Chinese firms listed in Hong Kong. The results of this preliminary analysis show that the frequency of audit committee meetings is associated with reduced levels of abnormal accruals, our measure of earnings management. We conclude that audit committee activity is an important factor in constraining the propensity of managers to engage in earnings management. In contrast, we find that the size of the audit committee is associated with increased levels of abnormal accruals and suggest that increasing the size of the audit committee creates information asymmetry between the audit committee and management that reduces the monitoring capacity of the audit committee. We do not find any association between audit committee independence, financial and industry experience, or ownership concentration and abnormal accruals.
A story worth telling : putting oral history and digital collections online in cultural institutions
Abstract:
Digital platforms in cultural institutions offer exciting opportunities for oral history and digital storytelling that can augment and enrich traditional collections. The way in which cultural institutions allow access to the public is changing dramatically, prompting substantial expansions of their oral history and digital story holdings. In Queensland, Australia, public libraries and museums are becoming innovative hubs for a wide assortment of collections that represent a cross-section of community groups and organisations through the integration of oral history and digital storytelling. The State Library of Queensland (SLQ) features digital stories online to encourage users to explore what the institution has in the catalogue through its website. Now SLQ also offers oral history interviews online, to introduce current as well as new users to oral history and to other components of its collections, such as photographs and documents. This includes the various departments, Indigenous centres and regional libraries affiliated with SLQ statewide, which are often unable to access the materials held within the institution, or even full information about the collections available there. There has been a growing demand for resources and services that help to satisfy community enthusiasm and promote engagement. Demand increases as public access to affordable digital media technologies increases and as community or marginalised groups become interested in do-it-yourself (DIY) history; SLQ encourages this. This paper draws on the oral history and digital story-based research undertaken by the Queensland University of Technology (QUT) for the State Library of Queensland, including: the Apology Collection: The Prime Minister’s apology to Australia’s Indigenous Stolen Generation; Five Senses: regional Queensland artists; Gay history of Brisbane; and The Queensland Business Leaders Hall of Fame.
Abstract:
My research investigates why nouns are learned disproportionately more frequently than other kinds of words during early language acquisition (Gentner, 1982; Gleitman et al., 2004). This question must be considered in the context of cognitive development in general. Infants have two major streams of environmental information to make meaningful: perceptual and linguistic. Perceptual information flows in from the senses and is processed into symbolic representations by the primitive language of thought (Fodor, 1975). These symbolic representations are then linked to linguistic input to enable language comprehension and ultimately production. Yet how exactly does perceptual information become conceptualized? Although this question is difficult, there has been progress. One way that children might have an easier job is if they have structures that simplify the data. Thus, if particular sorts of perceptual information could be separated from the mass of input, then it would be easier for children to refer to those specific things when learning words (Spelke, 1990; Pylyshyn, 2003). It would be easier still if linguistic input was segmented in predictable ways (Gentner, 1982; Gleitman et al., 2004). Unfortunately, the frequency of patterns in lexical or grammatical input cannot explain the cross-cultural and cross-linguistic tendency to favor nouns over verbs and predicates. There are three examples of this failure: 1) a wide variety of nouns are uttered less frequently than a smaller number of verbs and yet are learnt far more easily (Gentner, 1982); 2) word order and morphological transparency offer no insight when you contrast the sentence structures and word inflections of different languages (Slobin, 1973); and 3) particular language teaching behaviors (e.g. pointing at objects and repeating names for them) have little impact on children's tendency to prefer concrete nouns in their first fifty words (Newport, et al., 1977). Although the linguistic solution appears problematic, there has been increasing evidence that the early visual system does indeed segment perceptual information in specific ways before the conscious mind begins to intervene (Pylyshyn, 2003). I argue that nouns are easier to learn because their referents directly connect with innate features of the perceptual faculty. This hypothesis stems from work done on visual indexes by Zenon Pylyshyn (2001, 2003). Pylyshyn argues that the early visual system (the architecture of the "vision module") segments perceptual data into pre-conceptual proto-objects called FINSTs. FINSTs typically correspond to physical things such as Spelke objects (Spelke, 1990). Hence, before conceptualization, visual objects are picked out by the perceptual system demonstratively, like a pointing finger indicating ‘this’ or ‘that’. I suggest that this primitive system of demonstration elaborates on Gareth Evans's (1982) theory of nonconceptual content. Nouns are learnt first because their referents attract demonstrative visual indexes. This theory also explains why infants less often name stationary objects such as ‘plate’ or ‘table’, but do name things that attract the focal attention of the early visual system, i.e., small objects that move, such as ‘dog’ or ‘ball’. This view leaves open the questions of how blind children learn words for visible objects and why children learn category nouns (e.g. 'dog') rather than proper nouns (e.g. 'Fido') or higher taxonomic distinctions (e.g. 'animal').
Abstract:
Safety at roadway intersections is of significant interest to transportation professionals due to the large number of intersections in transportation networks, the complexity of traffic movements at these locations that leads to large numbers of conflicts, and the wide variety of geometric and operational features that define them. A variety of collision types including head-on, sideswipe, rear-end, and angle crashes occur at intersections. While intersection crash totals may not reveal a site deficiency, over-representation of a specific crash type may reveal otherwise undetected deficiencies. Thus, there is a need to be able to model the expected frequency of crashes by collision type at intersections to enable the detection of problems and the implementation of effective design strategies and countermeasures. Statistically, it is important to model collision type frequencies simultaneously to account for the possibility of common unobserved factors affecting crash frequencies across crash types. In this paper, a simultaneous equations model of crash frequencies by collision type is developed and presented using crash data for rural intersections in Georgia. The model estimation results support the notion of the presence of significant common unobserved factors across crash types, although the impact of these factors on parameter estimates is found to be rather modest.
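A minimal sketch of the underlying idea follows: fit single-equation count models per collision type and check whether their residuals are correlated, which is the cross-equation dependence a simultaneous specification is designed to capture. The simulated data and column names are assumptions, and the sketch is not the authors' estimator.

```python
# Illustrative sketch (not the authors' exact specification): fit separate Poisson
# models for two collision types at the same intersections and inspect the
# correlation of their residuals, the signal of common unobserved factors that
# motivates a simultaneous (joint) model.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500
# Simulated intersection data: traffic exposure plus a shared unobserved site
# effect that induces correlation across crash types.
site_effect = rng.normal(0, 0.3, n)
data = pd.DataFrame({
    "log_aadt_major": np.log(rng.uniform(2000, 20000, n)),
    "log_aadt_minor": np.log(rng.uniform(200, 5000, n)),
})
mu_angle = np.exp(-6 + 0.6 * data.log_aadt_major + 0.3 * data.log_aadt_minor + site_effect)
mu_rear = np.exp(-7 + 0.8 * data.log_aadt_major + 0.1 * data.log_aadt_minor + site_effect)
data["angle_crashes"] = rng.poisson(mu_angle)
data["rear_end_crashes"] = rng.poisson(mu_rear)

# Independent (single-equation) Poisson models, one per collision type.
m_angle = smf.glm("angle_crashes ~ log_aadt_major + log_aadt_minor",
                  data, family=sm.families.Poisson()).fit()
m_rear = smf.glm("rear_end_crashes ~ log_aadt_major + log_aadt_minor",
                 data, family=sm.families.Poisson()).fit()

# Correlated residuals across crash types suggest common unobserved factors,
# which is what joint estimation is meant to account for.
resid_corr = np.corrcoef(m_angle.resid_pearson, m_rear.resid_pearson)[0, 1]
print(m_angle.summary().tables[1])
print(m_rear.summary().tables[1])
print(f"Pearson residual correlation across crash types: {resid_corr:.2f}")
```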
Abstract:
System analysis within the traction power system is vital to the design and operation of an electrified railway. Loads in traction power systems are often characterised by their mobility, wide range of power variations, regeneration and service dependence. In addition, the feeding systems may take different forms in AC electrified railways. Comprehensive system studies are usually carried out by computer simulation. A number of traction power simulators are available; they allow calculation of the electrical interaction among trains and provide deterministic solutions of the power network. In this paper, a different approach is presented that enables load-flow analysis of various feeding systems and service demands in AC railways by adopting probabilistic techniques. It is intended to provide a different viewpoint on the load condition. Simulation results are given to verify the probabilistic-load-flow models.
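A minimal Monte Carlo sketch of a probabilistic feeder study is given below; it assumes a single-end-fed 25 kV feeder with trains modelled as constant-current loads, which is a simplification and not the paper's probabilistic load-flow formulation. All parameter values are assumed.

```python
# Minimal Monte Carlo sketch of a probabilistic feeder study (illustrative only).
# Random train counts, positions and power demands (including regeneration) are
# sampled, and the resulting voltage and current distributions are summarised.
import numpy as np

rng = np.random.default_rng(1)
V0 = 25_000.0                      # substation busbar voltage (V, rms), assumed
z = complex(0.17, 0.43)            # feeder impedance per km (ohm/km), assumed
feeder_km = 30.0
n_trials = 10_000

min_voltages = np.empty(n_trials)
feeder_currents = np.empty(n_trials)

for t in range(n_trials):
    n_trains = rng.integers(1, 6)                  # service-dependent train count
    pos = rng.uniform(0.0, feeder_km, n_trains)    # train positions (km)
    p = rng.uniform(-2e6, 8e6, n_trains)           # power demand (W); negative = regeneration
    i = p / V0 * np.exp(-1j * np.deg2rad(25))      # constant-current load model, assumed 25 deg lag
    # Voltage at each train on a radial feeder: drops accumulate over shared sections.
    v = V0 - np.array([np.sum(z * np.minimum(x, pos) * i) for x in pos])
    min_voltages[t] = np.abs(v).min()
    feeder_currents[t] = np.abs(i.sum())

print(f"Mean minimum train voltage: {min_voltages.mean()/1e3:.2f} kV")
print(f"95th percentile feeder current: {np.percentile(feeder_currents, 95):.0f} A")
```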
Abstract:
Eating is an essential everyday life activity that has fascinated, captivated and defined society since time began. We currently exist in a society where over-consumption of food is an established risk factor for chronic disease, the rate of which is increasing alarmingly. 'Food literacy' is an emerging term used to describe what we, as individuals and as a community, know and understand about food and how to use it to meet our needs, and thus how it can potentially support and empower citizens to make healthy food choices. Exactly what the components of food literacy are, and how they influence food choice, are poorly defined and understood, but they are gaining increasing interest among health professionals, policy makers, community workers, educators and members of the public. This paper will build the argument for why concepts of 'food literacy' need to extend beyond the existing terms and measures used in the literature to describe the food skills and knowledge needed to make use of public health nutrition messages.
Abstract:
Background: The hedgehog signaling pathway is vital in early development but then becomes dormant, except in some tumours. Hedgehog inhibitors are being developed for potential use in cancer. Objectives/Methods: The objective of this evaluation is to review the initial clinical studies of the hedgehog inhibitor GDC-0449 in subjects with cancer. Results: Phase I trials have shown that GDC-0449 has benefits in subjects with metastatic or locally advanced basal-cell carcinoma and in one subject with medulloblastoma. GDC-0449 was well tolerated. Conclusions: Long-term efficacy and safety studies of GDC-0449 in these conditions and other solid cancers are now underway. These clinical trials with GDC-0449, and trials with other hedgehog inhibitors, will reveal whether or not it is beneficial and safe to inhibit the hedgehog pathway in a wide range of solid tumours.
Abstract:
Continuous biometric authentication schemes (CBAS) are built around biometrics supplied by users' behavioural characteristics and continuously check the identity of the user throughout the session. The current literature on CBAS primarily focuses on the accuracy of the system in order to reduce false alarms. However, these attempts do not consider various issues that might affect practicality in real-world applications and continuous authentication scenarios. One of the main issues is that existing CBAS rely on several samples of training data, either from both intruders and valid users or from the valid users' profiles alone. This means that historical profiles for either the legitimate users or possible attackers must be available or collected before prediction time. However, in some cases it is impractical to obtain the biometric data of the user in advance (before detection time). Another issue is the variability of the user's behaviour between the profile registered during enrollment and the profile observed during the testing phase. The aim of this paper is to identify the limitations of current CBAS in order to make them more practical for real-world applications. The paper also discusses a new application for CBAS that does not require any training data, either from intruders or from valid users.
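For context, a conventional CBAS decision loop might be sketched as follows; the features, threshold and window size are hypothetical, and the sketch illustrates why an enrollment profile (training data from the valid user) is normally required before prediction time, which is precisely the limitation discussed above. It is not the new scheme proposed in the paper.

```python
# Generic sketch of a conventional CBAS decision loop (illustrative only).
# An enrollment profile is built from the valid user's behavioural samples,
# then each live window is scored against it during the session.
import numpy as np

def enrol(feature_windows: np.ndarray) -> tuple[np.ndarray, np.ndarray]:
    """Build a behavioural profile (mean/std per feature) from enrollment windows."""
    return feature_windows.mean(axis=0), feature_windows.std(axis=0) + 1e-9

def anomaly_score(window: np.ndarray, profile: tuple[np.ndarray, np.ndarray]) -> float:
    """Mean absolute z-score of the current window against the enrolled profile."""
    mu, sigma = profile
    return float(np.mean(np.abs((window - mu) / sigma)))

rng = np.random.default_rng(2)
# Hypothetical behavioural features per observation window (e.g. keystroke timings).
enrollment = rng.normal(loc=[0.12, 0.30, 0.08], scale=0.02, size=(200, 3))
profile = enrol(enrollment)

THRESHOLD = 3.0  # tuned to trade false alarms against missed intrusions
for step in range(5):
    live_window = rng.normal(loc=[0.12, 0.30, 0.08], scale=0.02, size=3)
    score = anomaly_score(live_window, profile)
    print(f"window {step}: score={score:.2f} ->",
          "re-authenticate" if score > THRESHOLD else "session continues")
```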