Abstract:
Health reform practices in Canada and elsewhere have restructured the purpose and use of diagnostic labels and the processes of naming such labels. Diagnoses are no longer only a means to tell doctors and patients what may be wrong and to indicate potential courses of treatment; some diagnoses activate specialized services and supports for persons with a disability and those who provide care for them. In British Columbia, a standardized process of diagnosis with the outcome of an autism spectrum disorder gives access to government-provided health care and educational services and supports. Such processes enter individuals into a complex of text-mediated relations, regulated by the principles of evidence-based medicine. However, the diagnosis of autism in children is notoriously uncertain. Because of this ambiguity, standardizing the diagnostic process creates a hurdle for parents seeking help and support for children whose problems could lead to a diagnosis on the autism spectrum. Such processes and their organizing relations are problematized, explored and explicated below. Grounded in the epistemological and ontological shift offered by Dorothy E. Smith (1987; 1990a; 1999; 2005), this article reports on the findings of an institutional ethnographic study that explored the diagnostic process of autism in British Columbia. More specifically, this article focuses on the processes involved in going from mothers talking from their experience about their children's problems to the formalized and standardized, and thus "virtually" produced, diagnoses that may or may not give access to services and supports in different systems of care. Two psychologists, a developmental pediatrician, a social worker – members of a specialized multidisciplinary assessment team – and several mothers of children with a diagnosis on the autism spectrum were interviewed. The implications of standardizing the diagnostic process of a disability that is not clear-cut and has funding attached are discussed. This ethnography also provides a glimpse of the implications of current and ongoing reforms in the state-supported health care system in British Columbia, and more generally in Canada, for people's everyday doings.
Abstract:
Inland waters are of global biogeochemical importance. They receive carbon inputs of ~4.8 Pg C/y, of which 12% is buried, 18% is transported to the oceans, and 70% supports aquatic secondary production. However, the mechanisms that determine the fate of organic matter (OM) in these systems are poorly defined. One aspect of this is the formation of organo-mineral complexes in aquatic systems and their potential as a route for OM transport and burial vs. their use as carbon (C) and nitrogen (N) sources within aquatic systems. Organo-mineral particles form by sorption of dissolved OM to freshly eroded mineral surfaces and may contribute to ecosystem-scale particulate OM fluxes. We experimentally tested the availability of mineral-sorbed OM as a C and N source for streamwater microbial assemblages and streambed biofilms. Organo-mineral particles were constructed in vitro by sorption of 13C:15N-labelled amino acids to hydrated kaolin particles, and microbial degradation of these particles was compared with equivalent doses of 13C:15N-labelled free amino acids. Experiments were conducted in 120 ml mesocosms over 7 days using biofilms and water sampled from the Oberer Seebach stream (Austria). Each incubation experienced a 16:8 light:dark regime, with metabolism monitored via changes in oxygen concentrations between photoperiods. The relative fate of the organo-mineral particles was quantified by tracing the mineralization of the 13C and 15N labels and their incorporation into microbial biomass. Here we present the initial results of 13C-label mineralization, incorporation and retention within the dissolved organic carbon (DOC) pool. The results indicate that 514 (±219) μmol/mmol of the 13C:15N-labelled free amino acids were mineralized over the 7-day incubations. By contrast, 186 (±97) μmol/mmol of the mineral-sorbed amino acids were mineralized over a similar period. Thus, organo-mineral complexation reduced amino acid mineralization by ~60%, with no differences observed between the streamwater and biofilm assemblages. Throughout the incubations, biofilms were observed to leach DOC. However, within the streamwater assemblage the presence of both organo-mineral particles and kaolin particles was associated with significant DOC removal (-1.7% and -7.5%, respectively). Consequently, the study demonstrates that mineral and organo-mineral particles can limit the availability of DOC in aquatic systems, providing nucleation sites for flocculation and fresh mineral surfaces, which facilitate OM sorption. The formation of these organo-mineral particles subsequently restricts microbial OM degradation, potentially altering the transport and facilitating the burial of OM within streams.
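The ~60% figure follows directly from the two mineralization means quoted above; a minimal sketch of the arithmetic (ignoring the ±219 and ±97 uncertainties):

```python
# Mean mineralization over the 7-day incubations (μmol per mmol added)
free_aa = 514.0    # free amino acids
sorbed_aa = 186.0  # mineral-sorbed amino acids

# Relative reduction attributable to organo-mineral complexation
reduction = 1 - sorbed_aa / free_aa
print(f"Mineralization reduced by {reduction:.0%}")  # ~64%, i.e. the ~60% reported
```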
Abstract:
Most models of riverine eco-hydrology and biogeochemistry rely upon bulk parameterization of fluxes. However, the transport and retention of carbon and nutrients in headwater streams is strongly influenced by biofilms (surface-attached microbial communities), which results in strong feedbacks between stream hydrodynamics and biogeochemistry. Mechanistic understanding of the interactions between streambed biofilms and nutrient dynamics is lacking. Here we present experimental results linking microscale observations of biofilm community structure to the deposition and resuspension of clay-sized mineral particles in streams. Biofilms were grown in identical 3 m recirculating flumes over periods of 14-50 days. Fluorescent particles were introduced to each flume, and their deposition was traced over 30 minutes. Particle resuspension from the biofilms was then observed under an increased stream flow, mimicking a flood event. We quantified particle fluxes using flow cytometry and epifluorescence microscopy. We directly observed particle adhesion to the biofilm using a confocal laser scanning microscope. 3-D Optical Coherence Tomography was used to determine biofilm roughness, areal coverage and void space in each flume. These measurements allow us to link biofilm complexity to particle retention during both baseflow and floodflow. The results suggest that increased biofilm complexity favors deposition and retention of fine particles in streams.
Abstract:
Inherently error-resilient applications in areas such as signal processing, machine learning and data analytics provide opportunities for relaxing reliability requirements, and thereby reducing the overhead incurred by conventional error-correction schemes. In this paper, we exploit the tolerable imprecision of such applications by designing an energy-efficient fault-mitigation scheme for unreliable data memories that meets a target yield. The proposed approach uses a bit-shuffling mechanism to isolate faults into bit locations with lower significance. This skews the bit-error distribution towards the low-order bits, substantially limiting the output error magnitude. By controlling the granularity of the shuffling, the proposed technique enables trading off quality for power, area, and timing overhead. Compared to error-correction codes, this can reduce the overhead by as much as 83% in read power, 77% in read access time, and 89% in area, when applied to various data mining applications in a 28 nm process technology.
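The abstract does not spell out the shuffling mechanism itself; the sketch below illustrates the general idea under assumed details: given the faulty cell positions of a memory word (known from memory test), a per-word rotation is chosen so that faults corrupt only the least significant logical bits, bounding the worst-case error magnitude. The rotation-based scheme and all names here are illustrative assumptions, not the paper's actual design.

```python
WORD_BITS = 16
MASK = (1 << WORD_BITS) - 1

def rotl(x: int, k: int) -> int:
    """Rotate a WORD_BITS-wide word left by k positions."""
    k %= WORD_BITS
    return ((x << k) | (x >> (WORD_BITS - k))) & MASK

def choose_rotation(faulty_cells: set[int]) -> int:
    """Pick the rotation k that maps every faulty physical cell to the
    logical bit of lowest possible significance (assumed policy)."""
    def worst_error(k: int) -> int:
        # A fault in physical cell b corrupts logical bit (b - k) mod WORD_BITS.
        return max((1 << ((b - k) % WORD_BITS) for b in faulty_cells), default=0)
    return min(range(WORD_BITS), key=worst_error)

def write_word(data: int, k: int) -> int:
    """Shuffle before storing: logical bit i lands in physical cell (i + k)."""
    return rotl(data, k)

def read_word(stored: int, k: int) -> int:
    """Undo the shuffle on read-back."""
    return rotl(stored, WORD_BITS - k)

# Example: cell 15 (MSB) is faulty -> rotate so the fault hits the logical LSB,
# shrinking the worst-case error from 2**15 to 1.
k = choose_rotation({15})
assert read_word(write_word(0xBEEF, k), k) == 0xBEEF
```

Shuffling coarser groups of bits instead of single bits would reduce the multiplexing overhead at the cost of a larger residual error, which appears to be the quality/overhead trade-off the abstract describes.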
Abstract:
This study investigates the re-employment hazard of displaced German workers using the first fourteen sweeps of the German Socio-Economic Panel (GSOEP) data. As well as parametric and non-parametric discrete-time specifications for the baseline hazard, the study employs alternative mixing distributions to account for unobserved heterogeneity. The findings suggest negative duration dependence, even after accounting for unobserved heterogeneity. In terms of covariate effects, those at the lower end of the skills ladder, those who had been working in manufacturing and those with previous experience of non-employment are found to have a lower hazard of exit via re-employment.
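As a rough illustration of the kind of model described (not the authors' actual specification), a discrete-time hazard with a non-parametric baseline can be estimated as a binary regression on person-period data, with duration dummies supplying the baseline hazard; the file and covariate names below are hypothetical stand-ins for GSOEP variables, and the mixing distribution for unobserved heterogeneity is omitted for brevity.

```python
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Person-period data: one row per worker per period of non-employment;
# 'exit' = 1 in the period the worker re-enters employment.
df = pd.read_csv("spells.csv")  # hypothetical file

# The cloglog link is the discrete-time counterpart of a continuous-time
# proportional hazards model; C(duration) dummies give a non-parametric baseline.
model = smf.glm(
    "exit ~ C(duration) + low_skilled + manufacturing + prior_nonemployment",
    data=df,
    family=sm.families.Binomial(link=sm.families.links.CLogLog()),
).fit()
print(model.summary())
```

Negative duration dependence would show up as coefficients on the higher duration dummies falling below those on the early ones, once the covariates and a heterogeneity mixture are controlled for.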
Abstract:
e-poltergeist is a Web-based artwork that takes over the user's internet browser, automatically initiating Web searches without their permission. It explores issues of user control when confronted with complex technological systems, questioning the limits of digital interactive arts as consensual reciprocal systems. e-poltergeist was a major web commission that marked an early stage of research in a larger enquiry by Craighead and Thomson into the relationship between live virtual data, global communications networks and instruction-based art, exploring how such systems can be re-contextualised within gallery environments. e-poltergeist presented the 'viewer' with a singular narrative by using live internet search-engine data, aiming to create a perpetual and virtually unstoppable cycle of search-engine results, banner ads and moving windows as an interruption into the normal use of an internet browser. The work also addressed the 'de-personalisation' of internet use by sending a series of messages from the live search-engine data that seemed to address the user directly: 'Is anyone there?'; 'Can anyone hear me?'; 'Please help me!'; 'Nobody cares!' e-poltergeist makes a significant contribution to the taxonomy of new media art by dealing with the way that new media art can re-address existing traditions in art such as appropriation and manipulation, instruction-based art and conceptual art. e-poltergeist was commissioned ($12,000) for 010101: Art in Technological Times, a landmark international exhibition presented by the San Francisco Museum of Modern Art, which brought together leading international practitioners working with emergent technologies, including Tatsuo Miyajima, Janet Cardiff and Brian Eno. Peer recognition of the project in the form of reviews includes: Cook, Sarah, Beryl Graham and Sarah Martin, Curating New Media (Gateshead: Baltic Centre for Contemporary Art), ISBN: 1093655064; The Wire; Wired, review by Reena Jana, http://www.wired.com/culture/lifestyle/news/2000/12/40464; Leonardo, review by Barbara Lee Williams and Sonya Rapoport, http://www.leonardo.info/reviews/feb2001/ex_010101_willrapop.html. All the work is developed jointly and equally between Craighead and her collaborator, Jon Thomson, Slade School of Fine Art.
Abstract:
Christoph Franz of Lufthansa recently identified Ryanair, easyJet, Air Berlin and Emirates as the company's main competitors – gone are the days when it could benchmark itself against BA or Air France-KLM! This paper probes behind the headlines to assess the extent to which different airlines are in competition, using evidence from the UK and mainland European markets. The issue of route versus network competition is addressed. Many regulators have put an emphasis on the former, whereas the latter, although less obvious, can be more relevant. For example, BA and American will cease to compete between London and Dallas Fort Worth if their alliance obtains anti-trust immunity, but 80% of the passengers on this route are connecting at one or both ends and hence arguably belong to different markets (e.g. London-San Francisco, Zurich-Dallas, Edinburgh-New Orleans), which may be highly contested. The remaining 20% of local traffic is actually insufficient to support a single point-to-point service in its own right. Estimates are made of the seat capacity major airlines are offering to the local market as distinct from feeding other routes. On a sector such as Manchester–Amsterdam, 60% of KLM's passengers are transferring at Schiphol as against only 1% of bmibaby's. Thus, although KLM operates 5 flights and 630 seats per day against bmibaby's 2 flights and 298 seats, in the point-to-point market bmibaby offers more seats than KLM (a calculation is sketched after this abstract). The growth of the Low Cost Carriers (LCCs) means that competition increasingly needs to be viewed on city-pair markets (e.g. London-Rome) rather than airport-pair markets (e.g. Heathrow-Fiumicino). As the stronger LCCs drive out weaker rivals and mainline carriers retrench to their major hubs, some markets now have fewer direct options than existed prior to the low-cost boom. Timings and frequencies are considered, in particular the extent to which services are a true alternative, especially for business travellers. LCCs typically offer lower frequencies and more unsociable timings (e.g. late evening arrivals at remote airports) as they are more focused on providing the cheapest service rather than the most convenient schedule. Interesting findings on 'monopoly' services are presented (including alliances) – certain airlines have many more of these than others. Lufthansa has a significant number of sectors to itself, whereas at the other extreme British Airways has direct competition on almost every route in its network. Ryanair and flybe have a higher proportion of monopoly routes than easyJet or Air Berlin. In the domestic US market it has become apparent since deregulation that better financial returns can come from dominating a large number of smaller markets rather than being heavily exposed in the major markets, which are hotly fought over. Regional niches that appear too thin for Ryanair to serve (with its all-189-seat 737-800 fleet) are identified. Fare comparisons in contrasting markets provide some insights into marketing and pricing strategies. Data sources used include OAG (schedules and capacity), AEA (traditional European airlines' traffic by region), the UK CAA (airport, airline and route traffic plus survey information on passenger types) and ICAO (international route traffic and capacity by carrier). It is concluded that airlines often have different competitors depending on the context, but in surprisingly many cases there are actually few or no direct substitutes.
The competitive process set in train by the deregulation of European air services in the 1990s is leading back to a market of natural monopolies and oblique alternatives; it is the names of the main participants that have changed, however!
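The Manchester–Amsterdam example can be made concrete with a quick calculation using only the figures quoted above (a sketch; applying passenger transfer shares to seat counts assumes similar load factors on both carriers):

```python
# Daily seats offered, and share of passengers who are point-to-point
klm_seats, klm_p2p = 630, 1 - 0.60        # 60% of KLM passengers transfer at Schiphol
bmibaby_seats, bmibaby_p2p = 298, 1 - 0.01  # only 1% of bmibaby's transfer

klm_local = klm_seats * klm_p2p            # ~252 seats serving the local market
bmibaby_local = bmibaby_seats * bmibaby_p2p  # ~295 seats serving the local market
print(f"KLM local: {klm_local:.0f}, bmibaby local: {bmibaby_local:.0f}")
```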
Abstract:
An evaluation of the change in perceived image contrast with changes in displayed image size was carried out. This was achieved using data from four psychophysical investigations, which employed techniques to match the perceived contrast of displayed images of five different sizes. A total of twenty-four S-shaped polynomial functions were created and applied to every original test image to produce images with different contrast levels. The objective contrast related to each function was evaluated from the gradient of the mid-section of the curve (gamma). The manipulation technique took into account published gamma differences that produced a just-noticeable difference (JND) in perceived contrast. The filters were designed to achieve approximately half a JND, whilst keeping the mean image luminance unaltered. The processed images were then used as test series in a contrast-matching experiment. Sixty-four natural scenes, with varying scene content acquired under various illumination conditions, were selected from a larger set captured for the purpose. Results showed that the degree of change in contrast between images of different sizes varied with scene content, but was not as important as equivalent perceived changes in sharpness.
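The contrast manipulation described can be pictured with a short sketch: an S-shaped tone curve applied to normalized luminance, whose mid-section slope plays the role of the gamma measure above. The tanh-based curve and its parameter are illustrative assumptions, not the paper's actual polynomial functions.

```python
import numpy as np

def s_curve(x: np.ndarray, a: float) -> np.ndarray:
    """S-shaped tone curve on normalized luminance x in [0, 1].
    a > 1 raises contrast, a < 1 lowers it; the fixed midpoint at 0.5
    helps keep mean image luminance approximately unchanged."""
    return 0.5 + 0.5 * np.tanh(a * (x - 0.5)) / np.tanh(a / 2)

# 'Gamma' in the sense used above: gradient of the curve's mid-section
x = np.linspace(0.45, 0.55, 11)
gamma = np.gradient(s_curve(x, a=4.0), x).mean()
print(f"mid-tone gradient ~ {gamma:.2f}")  # > 1, i.e. a contrast-boosting curve
```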
Abstract:
In the last few years we have observed an exponential increase in information systems, and parking information is one more example of this. Reliable and up-to-date information on parking slot availability is very important to the goal of traffic reduction, and parking slot prediction is a new topic that has already started to be applied; San Francisco in the United States and Santander in Spain are examples of projects carried out to obtain this kind of information. The aim of this thesis is the study and evaluation of methodologies for parking slot prediction and their integration in a web application, where all kinds of users will be able to see the current parking status as well as future status according to the parking model's predictions. The source of the data is ancillary to this work, but it still needs to be understood in order to understand parking behaviour. There are many modelling techniques used for this purpose, such as time series analysis, decision trees, neural networks and clustering. In this work, the author evaluates the techniques best suited to this task, analyses the results and points out the advantages and disadvantages of each one. The model learns the periodic and seasonal patterns of the parking status behaviour, and with this knowledge it can predict future status values for a given date. The data come from the Smart Park Ontinyent project and consist of parking occupancy status together with timestamps, stored in a database. After data acquisition, data analysis and pre-processing were needed before the models could be implemented. The first test used a boosting ensemble classifier, employed over a set of decision trees created with the C5.0 algorithm from a set of training samples, to assign a prediction value to each object. In addition to the predictions, this work reports measurement errors that indicate how reliable the outcome predictions are. The second test used the function-fitting seasonal exponential smoothing TBATS model. Finally, the last test tried a model that combines the previous two, to see the result of such a combination. The results were quite good for all of them, with average errors of 6.2, 6.6 and 5.4 vacancies for the three models respectively; for a car park of 47 spaces this means roughly a 10% average error in parking slot predictions. This result could be even better with more data available. In order to make this kind of information visible and reachable by everyone with an internet-connected device, a web application was built for this purpose. Besides displaying the data, this application also offers different functions to improve the task of searching for parking (a modelling sketch follows this abstract). Apart from parking prediction, the new functions were:
- Parking distances from the user's location: the distances from the user's current location to the different car parks in the city.
- Geocoding: the service for matching a literal description or an address to a concrete location.
- Geolocation: the service for positioning the user.
- Parking list panel: not a service or a function as such, but a better visualization and handling of the information.
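As a sketch of the first modelling step described above (a boosted decision-tree ensemble over timestamp-derived features), the snippet below uses scikit-learn's GradientBoostingRegressor as a regression stand-in for the thesis's C5.0-based boosting classifier; the file and column names are hypothetical.

```python
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

# Hypothetical export of the occupancy database: timestamp + free slots
df = pd.read_csv("occupancy.csv", parse_dates=["timestamp"])

# Encode the periodic/seasonal patterns as calendar features
df["hour"] = df["timestamp"].dt.hour
df["weekday"] = df["timestamp"].dt.weekday
df["month"] = df["timestamp"].dt.month

X, y = df[["hour", "weekday", "month"]], df["free_slots"]
# No shuffling: keep the chronological split a forecasting task requires
X_train, X_test, y_train, y_test = train_test_split(X, y, shuffle=False)

model = GradientBoostingRegressor().fit(X_train, y_train)
mae = mean_absolute_error(y_test, model.predict(X_test))
print(f"Average vacancy prediction error: {mae:.1f} slots")
```

The mean absolute error in slots is directly comparable to the 6.2/6.6/5.4 vacancy errors quoted above; the TBATS and combined models would be evaluated against the same held-out period.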