909 results for “Prolonged application times”


Relevance: 20.00%

Publisher:

Abstract:

We review all journal articles based on “PSED-type” research, i.e., longitudinal, empirical studies of large probability samples of ongoing business start-up efforts. We conclude that the research stream has yielded interesting findings, sometimes by confirming prior research with a less bias-prone methodology and at other times by challenging whether prior conclusions are valid for the early stages of venture development. Most importantly, the research has addressed new, process-related research questions that prior research has shunned or been unable to study in a rigorous manner. The research has revealed an enormous and fascinating variability in new venture creation that also makes it challenging to arrive at broadly valid generalizations. An analysis of the findings across studies, as well as an examination of those studies that have been relatively more successful at explaining outcomes, gives good guidance regarding what is required in order to achieve strong and credible results. We compile and present such advice to users of existing data sets and designers of new projects in the following areas: statistically representative and/or theoretically relevant sampling; level-of-analysis issues; dealing with process heterogeneity; dealing with other heterogeneity issues; and choice and interpretation of dependent variables.

Relevance: 20.00%

Publisher:

Abstract:

Photo-curable biodegradable macromers were prepared by ring-opening polymerization of D,L-lactide (DLLA), ε-caprolactone (CL) and 1,3-trimethylene carbonate (TMC) in the presence of glycerol or sorbitol as initiator and stannous octoate as catalyst, and subsequent methacrylation of the terminal hydroxyl groups. These methacrylated macromers, ranging in molecular weight from approximately 700 to 6000 g/mol, were cross-linked using ultraviolet (UV) light to form biodegradable networks. Homogeneous networks with high gel contents were obtained. One of the resins based on PTMC was used to prepare three-dimensional structures by stereolithography using a commercially available apparatus.

Relevance: 20.00%

Publisher:

Abstract:

The urban waterfront may be regarded as the littoral frontier of human settlement. Typically it advances over the years, and sometimes retreats, as terrestrial and aquatic processes interact and frequently contest this margin of occupation. Because most towns and cities are sited beside water bodies, many of them on or close to the sea, their physical expansion is constrained by the existence of aquatic areas in one or more directions from the core. It is usually much easier for new urban development to occur along or inland from the waterfront. Where other physical constraints, such as rugged hills or mountains, make expansion difficult or expensive, building at greater densities or construction on steep slopes is a common response. This kind of development, though technically feasible, is usually more expensive than construction on level or gently sloping land. Moreover, there are many reasons for developing along the shore or riverfront in preference to using sites further inland. The high cost of developing existing dry land that presents serious construction difficulties is one reason for creating new land from adjacent areas that are permanently or periodically under water. Another reason is the relatively high value of artificially created land close to the urban centre when compared with the value of existing developable space at a greater distance inland. The creation of space for development is not the only motivation for urban expansion into aquatic areas. Commonly, urban places on the margins of the sea, estuaries, rivers or great lakes are, or were once, ports where shipping played an important role in the economy. The demand for deep waterfronts to allow ships to berth and for adjacent space to accommodate various port facilities has encouraged the advance of the urban land area across marginal shallows in ports around the world. The space and locational demands of port-related industry and commerce, too, have contributed to this process. Often closely related to these developments is the generation of waste, including domestic refuse, unwanted industrial by-products, site formation and demolition debris and harbour dredgings. From ancient times, the foreshore has been used as a disposal area for waste from nearby settlements, a practice that continues on a huge scale today. Land formed in this way has long been used for urban development, despite problems that can arise from the nature of the dumped material and the way in which it is deposited. Disposal of waste material is a major factor in the creation of new urban land. Pollution of the foreshore and other water-margin wetlands in this way encouraged the idea that the reclamation of these areas may be desirable on public health grounds. With reference to examples from various parts of the world, the historical development of the urban littoral frontier and its effects on the morphology and character of towns and cities are illustrated and discussed. The threat of rising sea levels and the heritage value of many waterfront areas are other considerations that are addressed.

Relevance: 20.00%

Publisher:

Abstract:

Globally, teaching has become more complex and more challenging over recent years, with new and increased demands being placed on teachers by students, their families, governments and wider society. Teachers work with more diverse communities in times characterised by volatility, uncertainty and moral ambiguity. Societal, political, economic and cultural shifts have transformed the contexts in which teachers work and have redefined the ways in which teachers interact with students. This qualitative study uses phenomenographic methods to explore the nature of pedagogic teacher-student interactions. The data analysis reveals five qualitatively different ways in which teachers experience pedagogic engagements with students. The resultant categories of description range from information providing, in which teachers are viewed as transmitters of a body of knowledge, through to mentoring, in which teachers are perceived as significant others in the lives of students, with their influence extending beyond the walls of the classroom and beyond the years of schooling. The paper concludes by arguing that if teachers are to prepare students for the challenges and opportunities in changing times, teacher education programs need to consider ways to facilitate the development of mentoring capacities in new teachers.

Relevance: 20.00%

Publisher:

Abstract:

The driving task requires sustained attention over prolonged periods and can be performed in highly predictable or repetitive environments. Such conditions can create hypovigilance and impair performance in response to critical events. Identifying such impairment in monotonous conditions has been a major subject of research, but no research to date has attempted to predict it in real time. This pilot study aims to show that performance decrements due to monotonous tasks can be predicted through mathematical modelling that takes sensation-seeking levels into account. A short vigilance task sensitive to brief lapses of vigilance, the Sustained Attention to Response Task, is used to assess participants' performance. The framework for prediction developed on this task could be extended to a monotonous driving task. A Hidden Markov Model (HMM) is proposed to predict participants' lapses in alertness. The driver's vigilance is modelled as a hidden state and correlated with a surrogate measure: the participant's reaction times. The experiment shows that the monotony of the task can lead to a marked decline in performance in less than five minutes, and that this impairment can be predicted four minutes in advance with 86% accuracy using HMMs. These results show that mathematical models such as HMMs can efficiently predict hypovigilance through surrogate measures. The presented model could support the development of an in-vehicle device that detects driver hypovigilance in advance and warns the driver accordingly, thus offering the potential to enhance road safety and prevent road crashes.
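As a rough illustration of how an HMM can flag hypovigilance from a surrogate measure such as reaction time, the sketch below implements a two-state model (alert / hypovigilant) with Gaussian emissions and a forward filter. The transition probabilities and emission parameters are illustrative assumptions only, not the values fitted in the study.

```python
import numpy as np

# Two hidden states: 0 = alert, 1 = hypovigilant.
# All parameters below are illustrative assumptions, not fitted values.
A = np.array([[0.95, 0.05],      # state transition probabilities
              [0.10, 0.90]])
pi = np.array([0.9, 0.1])         # initial state distribution
means = np.array([0.35, 0.70])    # mean reaction time (s) in each state
stds = np.array([0.05, 0.15])     # reaction-time std dev (s) in each state

def gaussian_pdf(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

def filter_hypovigilance(reaction_times):
    """Forward filter: P(hypovigilant | reaction times observed so far)."""
    belief = pi.copy()
    history = []
    for rt in reaction_times:
        belief = belief @ A                      # predict next hidden state
        belief *= gaussian_pdf(rt, means, stds)  # weight by emission likelihood
        belief /= belief.sum()                   # normalise to a distribution
        history.append(belief[1])
    return np.array(history)

# Example: reaction times drifting upwards, as in a monotonous task.
rts = np.concatenate([np.random.normal(0.35, 0.05, 50),
                      np.random.normal(0.65, 0.15, 30)])
p_hypo = filter_hypovigilance(rts)
print("P(hypovigilant) at end of run: %.2f" % p_hypo[-1])
```

A warning could then be raised whenever the filtered probability of the hypovigilant state stays above a chosen threshold for several consecutive responses.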

Relevance: 20.00%

Publisher:

Abstract:

Hazard perception in driving is one of the few driving-specific skills associated with crash involvement. However, this relationship has only been examined in studies where the majority of individuals were younger than 65. We present the first data revealing an association between hazard perception and self-reported crash involvement in drivers aged 65 and over. In a sample of 271 drivers, we found that individuals whose mean response time to traffic hazards was slower than 6.68 seconds (the ROC-curve-derived pass mark for the test) were 2.32 times (95% CI 1.46, 3.22) more likely to have been involved in a self-reported crash within the previous five years than those with faster response times. This ratio became 2.37 (95% CI 1.49, 3.28) when driving exposure was controlled for. As a comparison, individuals who failed a test of useful field of view were 2.70 (95% CI 1.44, 4.44) times more likely to crash than those who passed. The hazard perception test and the useful field of view measure accounted for separate variance in crash involvement. These findings indicate that hazard perception testing and training could be useful in road safety interventions for this age group.
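For readers unfamiliar with how figures of this kind are obtained, the sketch below shows one common way to derive a ROC-based pass mark (the threshold maximising Youden's J) and an odds ratio with a Woolf-type 95% confidence interval. The data are synthetic and the procedure is a generic illustration; it does not reproduce the analysis or the numbers reported above.

```python
import numpy as np
from sklearn.metrics import roc_curve

rng = np.random.default_rng(0)

# Synthetic data: 1 = crash-involved driver, 0 = crash-free driver.
# Crash-involved drivers are given slower hazard response times on average.
crashed = rng.integers(0, 2, size=271)
resp_time = rng.normal(6.0 + 1.0 * crashed, 1.5)

# ROC-based pass mark: threshold maximising Youden's J = sensitivity + specificity - 1.
fpr, tpr, thresholds = roc_curve(crashed, resp_time)
pass_mark = thresholds[np.argmax(tpr - fpr)]
print("Pass mark (s):", round(float(pass_mark), 2))

# 2x2 table: fail/pass the test vs crash-involved/crash-free.
fail = resp_time > pass_mark
a = np.sum(fail & (crashed == 1))
b = np.sum(fail & (crashed == 0))
c = np.sum(~fail & (crashed == 1))
d = np.sum(~fail & (crashed == 0))

# Odds ratio with Woolf (log) 95% confidence interval.
odds_ratio = (a * d) / (b * c)
se_log_or = np.sqrt(1/a + 1/b + 1/c + 1/d)
ci = np.exp(np.log(odds_ratio) + np.array([-1.96, 1.96]) * se_log_or)
print("OR = %.2f, 95%% CI (%.2f, %.2f)" % (odds_ratio, ci[0], ci[1]))
```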

Relevance: 20.00%

Publisher:

Abstract:

This paper reports on the development of specifications for an on-board mass monitoring (OBM) application for regulatory requirements in Australia. An earlier paper reported on the feasibility study and pilot testing program conducted prior to the specification development [1]. Lessons from the pilot were used to refine the testing process, and a full-scale testing program was conducted from July to October 2008. The results from the full-scale test and their evidentiary implications are presented in this paper. The draft specification for an evidentiary on-board mass monitoring application is currently under development.

Relevance: 20.00%

Publisher:

Abstract:

The economiser is a critical component for efficient operation of coal-fired power stations. It consists of a large system of water-filled tubes which extract heat from the exhaust gases. When it fails, usually due to erosion causing a leak, the entire power station must be shut down to effect repairs. Not only are such repairs highly expensive, but the overall repair costs are significantly affected by fluctuations in electricity market prices, due to revenue lost during the outage. As a result, decisions about when to repair an economiser can alter the repair costs by millions of dollars. Therefore, economiser repair decisions are critical and must be optimised. However, making optimal repair decisions is difficult because economiser leaks are a type of interactive failure. If left unfixed, a leak in a tube can cause additional leaks in adjacent tubes, which will need more time to repair. In addition, when choosing repair times, one also needs to consider a number of other uncertain inputs such as future electricity market prices and demands. Although many different decision models and methodologies have been developed, an effective decision-making method specifically for economiser repairs has yet to be defined. In this paper, we describe a Decision Tree-based method to meet this need. An industrial case study is presented to demonstrate the application of our method.
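A minimal sketch of the kind of expected-cost comparison that a decision-tree analysis of the repair decision involves is given below. The probabilities, electricity prices, outage durations and repair costs are hypothetical placeholders, not figures from the industrial case study, and the tree is reduced to two branches for brevity.

```python
# Hypothetical decision-tree evaluation: repair the economiser leak now, or
# defer the outage in the hope of lower electricity prices, at the risk of
# the leak spreading to adjacent tubes and lengthening the repair.
# All numbers are illustrative assumptions, not case-study data.

def expected_cost(outage_hours, price_scenarios, lost_mw=500, repair_cost=200_000):
    """Expected cost of an outage: repair cost plus expected lost revenue."""
    expected_price = sum(price * prob for price, prob in price_scenarios)
    return repair_cost + outage_hours * lost_mw * expected_price

# Scenario prices in $/MWh with their probabilities.
prices_now = [(80.0, 1.0)]                    # current spot price, known
prices_later = [(40.0, 0.6), (120.0, 0.4)]    # uncertain future prices

# Branch 1: repair immediately (short outage, known price).
cost_now = expected_cost(outage_hours=24, price_scenarios=prices_now)

# Branch 2: defer repair; with some probability the leak spreads, so the
# outage takes longer and the repair itself costs more.
p_spread = 0.3
cost_defer = ((1 - p_spread) * expected_cost(24, prices_later)
              + p_spread * expected_cost(72, prices_later, repair_cost=500_000))

print(f"Expected cost, repair now:   ${cost_now:,.0f}")
print(f"Expected cost, defer repair: ${cost_defer:,.0f}")
print("Decision:", "repair now" if cost_now <= cost_defer else "defer")
```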

Relevance: 20.00%

Publisher:

Abstract:

The problem of bubble contraction in a Hele-Shaw cell is studied for the case in which the surrounding fluid is of power-law type. A small perturbation of the radially symmetric problem is first considered, focussing on the behaviour just before the bubble vanishes, it being found that for shear-thinning fluids the radially symmetric solution is stable, while for shear-thickening fluids the aspect ratio of the bubble boundary increases. The borderline (Newtonian) case considered previously is neutrally stable, the bubble boundary becoming elliptic in shape with the eccentricity of the ellipse depending on the initial data. Further light is shed on the bubble contraction problem by considering a long thin Hele-Shaw cell: for early times the leading-order behaviour is one-dimensional in this limit; however, as the bubble contracts its evolution is ultimately determined by the solution of a Wiener-Hopf problem, the transition between the long-thin limit and the extinction limit in which the bubble vanishes being described by what is in effect a similarity solution of the second kind. This same solution describes the generic (slit-like) extinction behaviour for shear-thickening fluids, the interface profiles that generalise the ellipses that characterise the Newtonian case being constructed by the Wiener-Hopf calculation.
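For orientation, one common statement of the gap-averaged flow law for a power-law fluid of consistency K and index n in a Hele-Shaw cell of gap h is (up to conventions for the prefactor and for how the consistency is defined)

\[
\bar{\mathbf{u}} \;=\; -\,\frac{n}{2n+1}\left(\frac{h}{2}\right)^{\frac{n+1}{n}} K^{-\frac{1}{n}}\,|\nabla p|^{\frac{1}{n}-1}\,\nabla p ,
\]

so that \(|\bar{\mathbf{u}}|\propto|\nabla p|^{1/n}\); here \(n<1\) corresponds to shear-thinning fluids, \(n>1\) to shear-thickening fluids, and \(n=1\) recovers the Newtonian Hele-Shaw law \(\bar{\mathbf{u}}=-h^{2}\nabla p/(12\mu)\). This is stated only to fix notation; the paper's own scalings and prefactor conventions may differ.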

Relevance: 20.00%

Publisher:

Abstract:

Hong Kong is a modern global city with a reputation for well-regulated financial markets, but for years the government had been trying, with relatively little success, to enact laws on corporate rescue procedures. Against the backdrop of the Global Financial Crisis, the threat of a future economic meltdown gave the Hong Kong government the impetus to revisit the issue. This third attempt to codify statutory obligations on directors’ liability for insolvent trading has been criticised for setting the standards either too high or too low for directors trading whilst insolvent. There is also some reservation given the beliefs and values of directors in Chinese family-owned and controlled companies, which would most likely attempt to trade out of difficult times. Nevertheless, this does not detract from the fact that the enactment of corporate rescue procedures in Hong Kong in 2010 is a momentous achievement for the Hong Kong government.

Relevance: 20.00%

Publisher:

Abstract:

Purpose: To identify the challenges faced by local government in Indonesia when adopting a public asset management framework. Design: A case study of the South Sulawesi Provincial Government was used to achieve the research objective; the case study involved two data collection techniques, interviews and document analysis. Findings: The results indicate that there are significant challenges that Indonesian local governments need to manage when adopting a public asset management framework. These challenges are: the absence of an institutional and legal framework to support the application of asset management; the non-profit principle of public assets; the multiple jurisdictions involved in public asset management processes; the complexity of local government objectives; the unavailability of data for managing public property; and limited human resources. Research limitations: This research is limited to one case study. It is a preliminary study within a larger research project that uses multiple case studies; the main research also investigates opportunities for local government in adopting and implementing public asset management. Originality/value: Findings from this study provide useful input for policy makers, academics and asset management practitioners in Indonesia seeking to establish a public asset management framework that results in efficient and effective organizations and improved quality of public services. The study has potential application for other developing countries.

Relevance: 20.00%

Publisher:

Abstract:

This paper describes an experiment undertaken to investigate intuitive interaction, particularly in older adults. Previous work has shown that intuitive interaction relies on past experience, and has also suggested that older people demonstrate fewer intuitive uses and slower completion times when completing set tasks with various devices. Similarly, this experiment showed that past experience with relevant products allowed people to use the interfaces of two different microwaves more quickly and intuitively. It also revealed that certain aspects of cognitive decline related to aging, such as central executive function, have more impact on time, correct uses and intuitive uses than chronological age does. Implications of these results are discussed.

Relevance: 20.00%

Publisher:

Abstract:

Expanding human chondrocytes in vitro while maintaining their ability to form cartilage remains a key challenge in cartilage tissue engineering. One promising approach to address this is to use microcarriers as substrates for chondrocyte expansion. While microcarriers have shown beneficial effects for the expansion of animal and ectopic human chondrocytes, their utility has not been determined for freshly isolated adult human articular chondrocytes. Thus, we investigated the proliferation and subsequent chondrogenic differentiation of these clinically relevant cells on porous gelatin microcarriers and compared them to cells expanded in traditional monolayers. Chondrocytes attached to microcarriers within 2 days and remained viable over 4 weeks of culture in spinner flasks. Cells on microcarriers exhibited a spread morphology and initially proliferated faster than cells in monolayer culture; however, with prolonged expansion they were less proliferative. Cells expanded for 1 month and enzymatically released from microcarriers formed cartilaginous tissue in micromass pellet cultures, similar to the tissue formed by monolayer-expanded cells. Cells left attached to microcarriers did not exhibit chondrogenic capacity. Culture conditions such as microcarrier material, oxygen tension and mechanical stimulation require further investigation to facilitate the efficient expansion of clinically relevant human articular chondrocytes that maintain chondrogenic potential for cartilage regeneration applications.

Relevance: 20.00%

Publisher:

Abstract:

The present rate of technological advance continues to place significant demands on data storage devices. The sheer amount of digital data being generated each year, along with consumer expectations, fuels these demands. At present, most digital data is stored magnetically, in the form of hard disk drives or on magnetic tape. The increase in areal density (AD) of magnetic hard disk drives over the past 50 years has been of the order of 100 million times, and current devices store data at ADs of the order of hundreds of gigabits per square inch. However, it has been known for some time that progress in this form of data storage is approaching fundamental limits. The main limitation relates to the lower size limit that an individual bit can have for stable storage. Various techniques for overcoming these fundamental limits are currently the focus of considerable research effort. Most attempt to improve current data storage methods, or modify these slightly for higher-density storage. Alternatively, three-dimensional optical data storage is a promising field for the information storage needs of the future, offering very high-density, high-speed memory. There are two ways in which data may be recorded in a three-dimensional optical medium: either bit by bit (similar in principle to an optical disc medium such as CD or DVD) or by using pages of bit data. Bit-by-bit techniques for three-dimensional storage offer high density but are inherently slow due to the serial nature of data access. Page-based techniques, where a two-dimensional page of data bits is written in one write operation, can offer significantly higher data rates due to their parallel nature. Holographic Data Storage (HDS) is one such page-oriented optical memory technique. This field of research has been active for several decades, but few commercial products are presently available. Another page-oriented optical memory technique involves recording pages of data as phase masks in a photorefractive medium. A photorefractive material is one in which the refractive index can be modified by light of the appropriate wavelength and intensity, and this property can be used to store information in these materials. In phase mask storage, two-dimensional pages of data are recorded into a photorefractive crystal as refractive index changes in the medium. A low-intensity readout beam propagating through the medium will have its intensity profile modified by these refractive index changes, and a CCD camera can be used to monitor the readout beam and thus read the stored data. The main aim of this research was to investigate data storage using phase masks in the photorefractive crystal lithium niobate (LiNbO3). Firstly, the experimental methods for storing the two-dimensional pages of data (a set of vertical stripes of varying lengths) in the medium are presented. The laser beam used for writing, whose intensity profile is modified by an amplitude mask containing a pattern of the information to be stored, illuminates the lithium niobate crystal, and the photorefractive effect causes the patterns to be stored as refractive index changes in the medium. These patterns are read out non-destructively using a low-intensity probe beam and a CCD camera. A common complication of information storage in photorefractive crystals is the issue of destructive readout. This is a problem particularly for holographic data storage, where the readout beam should be at the same wavelength as the beam used for writing.
Since the charge carriers in the medium are still sensitive to the read light field, the readout beam erases the stored information. One method of avoiding this is thermal fixing, in which the photorefractive medium is heated to temperatures above 150 °C; this process forms an ionic grating in the medium. The ionic grating is insensitive to the readout beam, and therefore the information is not erased during readout. A non-contact method for determining temperature change in a lithium niobate crystal is presented in this thesis. The temperature-dependent birefringent properties of the medium cause intensity oscillations to be observed for a beam propagating through the medium during a change in temperature. It is shown that each oscillation corresponds to a particular temperature change, and by counting the number of oscillations observed, the temperature change of the medium can be deduced. The presented technique for measuring temperature change could easily be applied to situations where thermal fixing of data in a photorefractive medium is required. Furthermore, by using an expanded beam and monitoring the intensity oscillations over a wide region, it is shown that the temperature at various locations in the crystal can be monitored simultaneously. This technique could be used to deduce temperature gradients in the medium. It is shown that the three-dimensional nature of the recording medium causes interesting degradation effects to occur when the patterns are written for a longer-than-optimal time. This degradation results in the splitting of the vertical stripes in the data pattern, and for long writing exposure times this process can result in the complete deterioration of the information in the medium. It is shown that simply by using incoherent illumination, the original pattern can be recovered from the degraded state. The reason for the recovery is that the refractive index changes causing the degradation are of a smaller magnitude, since they are induced by the write field components scattered from the written structures. During incoherent erasure, the lower-magnitude refractive index changes are neutralised first, allowing the original pattern to be recovered. The degradation process is shown to be reversed during the recovery process, and a simple relationship is found relating the times at which particular features appear during degradation and recovery. A further outcome of this work is that a minimum stripe width of 30 µm is required for accurate storage and recovery of the information in the medium; any smaller size results in incomplete recovery. The degradation and recovery process could find application in image scrambling or cryptography for optical information storage. A two-dimensional numerical model based on the finite-difference beam propagation method (FD-BPM) is presented and used to gain insight into the pattern storage process. The model shows that the degradation of the patterns is due to the complicated path taken by the write beam as it propagates through the crystal, and in particular the scattering of this beam from the induced refractive index structures in the medium. The model indicates that the highest-quality pattern storage would be achieved with a thin 0.5 mm medium; however, this type of medium would also remove the degradation property of the patterns and the subsequent recovery process.
To overcome the simplistic treatment of the refractive index change in the FD-BPM model, a fully three-dimensional photorefractive model developed by Devaux is presented. This model provides significant insight into the pattern storage, particularly the degradation and recovery process, and confirms the theory that recovery of the degraded patterns is possible because the refractive index changes responsible for the degradation are of a smaller magnitude. Finally, a detailed analysis of the pattern formation and degradation dynamics for periodic patterns of various periodicities is presented. It is shown that stripe widths in the write beam of greater than 150 µm result in the formation of different types of refractive index changes compared with stripes of smaller widths. As a result, it is shown that the pattern storage method discussed in this thesis has an upper feature-size limit of 150 µm for accurate and reliable pattern storage.
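As a rough guide to the relation underlying the non-contact temperature measurement described above (assuming the probe beam traverses a crystal of length L between crossed polarisers and that thermal expansion is neglected), the transmitted intensity varies as

\[
I(T) \;\propto\; \sin^{2}\!\left(\frac{\pi L\,\Delta n(T)}{\lambda}\right),
\]

so each full intensity oscillation corresponds to the retardation \(L\,\Delta n\) changing by one wavelength, i.e. a temperature increment of roughly \(\Delta T \approx \lambda / \bigl(L\,\lvert \mathrm{d}(\Delta n)/\mathrm{d}T\rvert\bigr)\) per oscillation. The exact increment depends on the crystal orientation and on the calibration used in the thesis.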

Relevance: 20.00%

Publisher:

Abstract:

This paper reports on an empirical comparison of seven machine learning algorithms for texture classification, with application to vegetation management in power line corridors. Aiming to classify tree species in power line corridors, an object-based method is employed. Individual tree crowns are segmented as the basic classification units, and three classic texture features are extracted as the input to the classification algorithms. Several widely used performance metrics are used to evaluate the classification algorithms. The experimental results demonstrate that the classification performance depends on the performance metric, the characteristics of the datasets and the features used.
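As a generic illustration of this kind of algorithm comparison (not the authors' pipeline: the tree-crown segmentation, texture feature extraction and datasets are not reproduced here), the sketch below cross-validates several scikit-learn classifiers on placeholder texture feature vectors.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.naive_bayes import GaussianNB

# Placeholder data: each row is a texture feature vector for one segmented
# tree crown, each label a tree species. Replace with real extracted features.
rng = np.random.default_rng(42)
X = rng.normal(size=(300, 12))       # 300 crowns, 12 texture features
y = rng.integers(0, 3, size=300)     # 3 species classes

classifiers = {
    "SVM (RBF)": make_pipeline(StandardScaler(), SVC()),
    "k-NN": make_pipeline(StandardScaler(), KNeighborsClassifier()),
    "Random forest": RandomForestClassifier(n_estimators=200, random_state=0),
    "Naive Bayes": GaussianNB(),
}

# 5-fold cross-validated accuracy; other performance metrics (e.g. macro F1)
# can be requested via the `scoring` argument of cross_val_score.
for name, clf in classifiers.items():
    scores = cross_val_score(clf, X, y, cv=5)
    print(f"{name:15s} accuracy = {scores.mean():.3f} +/- {scores.std():.3f}")
```

In practice the placeholder arrays would be replaced by the extracted texture features and species labels, and the scoring argument changed to whichever performance metric is of interest.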