834 results for Collapsed objects and Supernovae
Abstract:
In visual object detection and recognition, classifiers have two interesting characteristics: accuracy and speed. Accuracy depends on the complexity of the image features and classifier decision surfaces. Speed depends on the hardware and the computational effort required to use the features and decision surfaces. When attempts to increase accuracy lead to increases in complexity and effort, it is necessary to ask how much we are willing to pay for increased accuracy. For example, if increased computational effort implies quickly diminishing returns in accuracy, then those designing inexpensive surveillance applications cannot aim for maximum accuracy at any cost. It becomes necessary to find trade-offs between accuracy and effort. We study efficient classification of images depicting real-world objects and scenes. Classification is efficient when a classifier can be controlled so that the desired trade-off between accuracy and effort (speed) is achieved and unnecessary computations are avoided on a per-input basis. A framework is proposed for understanding and modeling efficient classification of images. Classification is modeled as a tree-like process. In designing the framework, it is important to recognize what is essential and to avoid structures that are narrow in applicability. Earlier frameworks are lacking in this regard. The overall contribution is two-fold. First, the framework is presented, subjected to experiments, and shown to be satisfactory. Second, certain unconventional approaches are experimented with. This allows the separation of the essential from the conventional. To determine if the framework is satisfactory, three categories of questions are identified: trade-off optimization, classifier tree organization, and rules for delegation and confidence modeling. Questions and problems related to each category are addressed and empirical results are presented.
For example, related to trade-off optimization, we address the problem of computational bottlenecks that limit the range of trade-offs. We also ask if accuracy versus effort trade-offs can be controlled after training. For another example, regarding classifier tree organization, we first consider the task of organizing a tree in a problem-specific manner. We then ask if problem-specific organization is necessary.
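The delegation idea running through this abstract, cheap classifiers answering easy inputs and passing uncertain ones to costlier stages, can be illustrated with a minimal sketch. The class name, the stage interface, and the single confidence threshold `tau` are invented for illustration; they are not the thesis's actual framework:

```python
import numpy as np

class DelegatingCascade:
    """Hypothetical sketch of a delegation cascade: cheap classifiers answer
    confident inputs; uncertain inputs are delegated to costlier stages.
    The confidence threshold `tau` can be tuned after training, which is one
    way an accuracy-versus-effort trade-off could be controlled post hoc."""

    def __init__(self, stages, costs, tau=0.8):
        self.stages = stages   # callables: x -> class-probability vector
        self.costs = costs     # per-stage computational cost (arbitrary units)
        self.tau = tau         # delegation threshold (tunable post-training)

    def classify(self, x):
        effort = 0.0
        for stage, cost in zip(self.stages, self.costs):
            probs = stage(x)
            effort += cost
            if probs.max() >= self.tau:   # confident enough: stop early
                break
        return int(np.argmax(probs)), effort
```

Raising `tau` forces more inputs down to expensive stages (higher effort, potentially higher accuracy); lowering it lets cheap stages answer more often.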
Abstract:
The polarization position-angle swings that have been measured in a number of BL Lacertae objects and highly variable quasars are interpreted in terms of shock waves which illuminate (by enhanced synchrotron radiation) successive transverse cross sections of a magnetized, relativistic jet. The jet is assumed to have a nonaxisymmetric magnetic field configuration of the type discussed in the companion paper on the equilibria of force-free jets. For a jet that is viewed at a small angle to the axis, the passage of a shock will give rise to an apparent rotation of the polarization position angle whose amplitude can be substantially larger than 180 deg. The effects of freely propagating shocks are compared with those of bow shocks which form in front of dense obstacles in the jet, and specific applications to 0727 - 115 and BL Lacertae are considered. In the case of 0727 - 115, it is pointed out that the nonuniformity of the swing rate and the apparent oscillations of the degree of polarization could be a consequence of relativistic aberration.
Scopophobia/Scopophilia: electric light and the anxiety of the gaze in postwar American architecture
Abstract:
In the years of reconstruction and economic boom that followed the Second World War, the domestic sphere encountered new expectations regarding social behaviour, modes of living, and forms of dwelling. This book brings together an international group of scholars from architecture, design, urban planning, and interior design to reappraise mid-twentieth century modern life, offering a timely reassessment of culture and the economic and political effects on civilian life. This collection contains essays that examine the material of art, objects, and spaces in the context of practices of dwelling over the long span of the postwar period. It asks what role material objects, interior spaces, and architecture played in quelling or fanning the anxieties of modernism's ordinary denizens, and how this role informs their legacy today.
Table of Contents [Book]
Introduction, Robin Schuldenfrei
Part 1: Psychological Constructions: Anxiety of Isolation and Exposure
1. Taking Comfort in the Age of Anxiety: Eero Saarinen's Womb Chair, Cammie McAtee
2. The Future is Possibly Past: The Anxious Spaces of Gaetano Pesce, Jane Pavitt
3. Scopophobia/Scopophilia: Electric Light and the Anxiety of the Gaze in American Postwar Domestic Architecture, Margaret Petty
Part 2: Ideological Objects: Design and Representation
4. The Allegory of the Socialist Lifestyle: The Czechoslovak Pavilion at the Brussels Expo, its Gold Medal and the Politburo, Ana Miljacki
5. Assimilating Unease: Moholy-Nagy and the Wartime-Postwar Bauhaus in Chicago, Robin Schuldenfrei
6. The Anxieties of Autonomy: Peter Eisenman from Cambridge to House VI, Sean Keller
Part 3: Societies of Consumers: Materialist Ideologies and Postwar Goods
7. "But a home is not a laboratory": The Anxieties of Designing for the Socialist Home in the German Democratic Republic 1950–1965, Katharina Pfützner
8. Architect-designed Interiors for a Culturally Progressive Upper-Middle Class: The Implicit Political Presence of Knoll International in Belgium, Fredie Floré
9. Domestic Environment: Italian Neo-Avant-Garde Design and the Politics of Post-Materialism, Mary Louise Lobsinger
Part 4: Class Concerns and Conflict: Dwelling and Politics
10. Dirt and Disorder: Taste and Anxiety in the Working Class Home, Christine Atha
11. Upper West Side Stories: Race, Liberalism, and Narratives of Urban Renewal in Postwar New York, Jennifer Hock
12. Pawns or Prophets? Postwar Architects and Utopian Designs for Southern Italy, Anne Parmly Toxey
Coda: From Homelessness to Homelessness, David Crowley
Abstract:
Within the history of twentieth-century design, there are a number of well-known objects and stories that are invoked time and time again to capture a pivotal moment or summarize a much broader historical transition. For example, Marcel Breuer’s Model B3 chair is frequently used as a stand-in for the radical investigations of form and new industrial materials occurring at the Bauhaus in the mid-1920s. Similarly, Raymond Loewy’s streamlined pencil sharpener has become historical shorthand for the emergence of modern industrial design in the 1930s. And any discussion of the development of American postwar “organic design” seems incomplete without reference to Charles and Ray Eames’s molded plywood leg splint of 1942. Such objects and narratives are dear to historians of modern design. They are tangible, photogenic subjects that slot nicely into exhibitions, historical surveys, and coffee-table best sellers...
Abstract:
The object of this work is Hegel's Logic, which comprises the first third of his philosophical System, a System that also includes the Philosophy of Nature and the Philosophy of Spirit. The work is divided into two parts, where the first part investigates Hegel's Logic in itself, without explicit reference to the rest of Hegel's System. It is argued in the first part that Hegel's Logic contains a methodology for constructing examples of basic ontological categories. The starting point on which this construction is based is a structure Hegel calls Nothing, which I argue to be identical with an empty situation, that is, a situation with no objects in it. Examples of further categories are constructed, firstly, by making previous structures objects of new situations. This rule makes it possible for Hegel to introduce examples of ontological structures that contain objects as constituents. Secondly, Hegel also takes the very constructions he uses as constituents of further structures: thus, he is able to exemplify ontological categories involving causal relations. The final result of Hegel's Logic should then be a model of Hegel's Logic itself, or at least of its basic methods. The second part of the work focuses on the relation of Hegel's Logic to the other parts of Hegel's System. My interpretation tries to avoid, firstly, the extreme of taking Hegel's System as a grand metaphysical attempt to deduce what exists through abstract thinking, and secondly, the extreme of seeing Hegel's System as mere diluted Kantianism or a second-order investigation of theories concerning objects instead of actual objects. I suggest a third manner of reading Hegel's System, based on extending the constructivism of Hegel's Logic to the whole of his philosophical System. According to this interpretation, transitions between parts of Hegel's System should not be understood as proofs of any sort, but as constructions of one structure or its model from another structure.
Hence, these transitions involve at least, and especially within the Philosophy of Nature, modelling of one type of object or phenomenon through characteristics of an object or phenomenon of another type, and in the best case, and especially within the Philosophy of Spirit, transformations of an object or phenomenon of one type into an object or phenomenon of another type. Thus, the transitions and descriptions within Hegel's System concern actual objects and not mere theories, but they still involve no fallacious deductions.
Abstract:
The open development model of software production has been characterized as the future model of knowledge production and distributed work. The open development model refers to publicly available source code, ensured by an open source license, and to the extensive and varied distributed participation of volunteers enabled by the Internet. Contemporary spokesmen of open source communities and academics view open source development as a new form of volunteer work activity characterized by "hacker ethic" and "bazaar governance". The development of the Linux operating system is perhaps the best-known example of such an open source project. It started as an effort by a user-developer and grew quickly into a large project with hundreds of user-developers as contributors. However, in "hybrids", in which firms participate in open source projects oriented towards end-users, it seems that most users do not write code. In this study, the OpenOffice.org project, initiated by Sun Microsystems, represents such a project. In addition, Finnish public sector ICT decision-making concerning open source use is studied. The purpose is to explore the assumptions, theories and myths related to the open development model by analysing the discursive construction of the OpenOffice.org community: its developers, users and management. The qualitative study aims at shedding light on the dynamics and challenges of community construction and maintenance, and on related power relations in hybrid open source, by asking two main research questions: How are the structure and membership constellation of the community, specifically the relation between developers and users, linguistically constructed in hybrid open development? What characterizes Internet-mediated virtual communities and how can they be defined? How do they differ from hierarchical forms of knowledge production on the one hand and from traditional volunteer communities on the other?
The study utilizes sociological, psychological and anthropological concepts of community for understanding the connection between the real and the imaginary in so-called virtual open source communities. Intermediary methodological and analytical concepts are borrowed from discourse and rhetorical theories. A discursive-rhetorical approach is offered as a methodological toolkit for studying texts and writing in Internet communities. The empirical chapters approach the problem of community and its membership from four complementary points of view. The data comprise mailing list discussions, personal interviews, web page writings, email exchanges, field notes and other historical documents. The four viewpoints are: 1) the community as conceived by volunteers, 2) the individual contributor's attachment to the project, 3) public sector organizations as users of open source, and 4) the community as articulated by the community manager. I arrive at four conclusions concerning my empirical studies (1-4) and two general conclusions (5-6). 1) Sun Microsystems and OpenOffice.org Groupware volunteers failed to develop the necessary and sufficient open code and open dialogue to ensure collaboration, thus splitting the Groupware community into the volunteers ("we") and the firm ("them"). 2) Instead of separating intrinsic and extrinsic motivations, I find that volunteers' unique patterns of motivation are tied to changing objects and personal histories prior to and during participation in the OpenOffice.org Lingucomponent project. Rather than seeing volunteers as a unified community, they can be better understood as independent entrepreneurs in search of a "collaborative community". The boundaries between work and hobby are blurred and shifting, thus questioning the usefulness of the concept of "volunteer".
3) The public sector ICT discourse portrays a dilemma and tension between the freedom to choose, use and develop one's desktop in the spirit of open source, on the one hand, and the striving for better desktop control and maintenance by IT staff and user advocates, on the other. The link between the global OpenOffice.org community and local end-user practices is weak and mediated by the problematic IT staff-(end)user relationship. 4) "Authoring community" can be seen as a new type of managerial practice in hybrid open source communities. The ambiguous concept of community is a powerful strategic tool for orienting towards multiple real and imaginary audiences, as evidenced in the global membership rhetoric. 5) The changing and contradictory discourses of this study show a change in the conceptual system and developer-user relationship of the open development model. This change is characterized as a movement from "hacker ethic" and "bazaar governance" to a more professionally and strategically regulated community. 6) Community is simultaneously real and imagined, and can be characterized as a "runaway community". Discursive action can be seen as a specific type of online open source engagement. Hierarchies and structures are created through discursive acts. Key words: Open Source Software, open development model, community, motivation, discourse, rhetoric, developer, user, end-user
Abstract:
A lightning strike in the neighborhood can induce significant currents in tall down conductors. Though the magnitude of the induced current in this case is much smaller than that encountered during a direct strike, the probability of occurrence and the frequency content are higher. In view of this, appropriate knowledge of the characteristics of such induced currents is relevant for the scrutiny of recorded currents and for the evaluation of interference to electrical and electronic systems in the vicinity. Previously, a study was carried out on the characteristics of induced currents assuming ideal conditions, that is, that there were no influencing objects in the vicinity of the down conductor and channel. However, some influencing conducting bodies will always be present, such as trees, electricity and communication towers, buildings, and other elevated objects that can affect the induced currents in a down conductor. The present work is carried out to understand the influence of nearby conducting objects on the characteristics of induced currents due to a strike to ground in the vicinity of a tall down conductor. For the study, an electromagnetic model is employed to model the down conductor, channel, and neighboring conducting objects, and the Numerical Electromagnetics Code-2 is used for numerical field computations. Neighboring objects of different heights, of different shapes, and at different locations are considered. It is found that neighboring objects have a significant influence on the magnitude and nature of induced currents in a down conductor when the height of the nearby conducting object is comparable to that of the down conductor.
Abstract:
Mid-frequency active (MFA) sonar emits pulses of sound from an underwater transmitter to help determine the size, distance, and speed of objects. The sound waves bounce off objects and reflect back to underwater acoustic receivers as an echo. MFA sonar has been used since World War II, and the Navy indicates it is the only reliable way to track submarines, especially more recently designed submarines that operate more quietly, making them more difficult to detect. Scientists have asserted that sonar may harm certain marine mammals under certain conditions, especially beaked whales. Depending on the exposure, they believe that sonar may damage the ears of the mammals, causing hemorrhaging and/or disorientation. The Navy agrees that the sonar may harm some marine mammals, but says it has taken protective measures so that animals are not harmed. MFA training must comply with a variety of environmental laws, unless an exemption is granted by the appropriate authority. Marine mammals are protected under the Marine Mammal Protection Act (MMPA) and some under the Endangered Species Act (ESA). The training program must also comply with the National Environmental Policy Act (NEPA), and in some cases the Coastal Zone Management Act (CZMA). Each of these laws provides some exemption for certain federal actions. The Navy has invoked all of the exemptions to continue its sonar training exercises. Litigation challenging the MFA training off the coast of Southern California ended with a November 2008 U.S. Supreme Court decision. The Supreme Court said that the lower court had improperly favored the possibility of injuring marine animals over the importance of military readiness. The Supreme Court’s ruling allowed the training to continue without the limitations imposed on it by other courts. (pdf contains 20pp.)
Abstract:
My thesis studies how people pay attention to other people and the environment. How does the brain figure out what is important and what are the neural mechanisms underlying attention? What is special about salient social cues compared to salient non-social cues? In Chapter I, I review social cues that attract attention, with an emphasis on the neurobiology of these social cues. I also review neurological and psychiatric links: the relationship between saliency, the amygdala and autism. The first empirical chapter then begins by noting that people constantly move in the environment. In Chapter II, I study the spatial cues that attract attention during locomotion using a cued speeded discrimination task. I found that when the motion was expansive, attention was attracted towards the singular point of the optic flow (the focus of expansion, FOE) in a sustained fashion. The more ecologically valid the motion features became (e.g., temporal expansion of each object, spatial depth structure implied by distribution of the size of the objects), the stronger the attentional effects. However, compared to inanimate objects and cues, people preferentially attend to animals and faces, a process in which the amygdala is thought to play an important role. To directly compare social cues and non-social cues in the same experiment and investigate the neural structures processing social cues, in Chapter III, I employ a change detection task and test four rare patients with bilateral amygdala lesions. All four amygdala patients showed a normal pattern of reliably faster and more accurate detection of animate stimuli, suggesting that advantageous processing of social cues can be preserved even without the amygdala, a key structure of the “social brain”. People not only attend to faces, but also pay attention to others’ facial emotions and analyze faces in great detail. 
Humans have a dedicated system for processing faces and the amygdala has long been associated with a key role in recognizing facial emotions. In Chapter IV, I study the neural mechanisms of emotion perception and find that single neurons in the human amygdala are selective for subjective judgment of others’ emotions. Lastly, people typically pay special attention to faces and people, but people with autism spectrum disorders (ASD) might not. To further study social attention and explore possible deficits of social attention in autism, in Chapter V, I employ a visual search task and show that people with ASD have reduced attention, especially social attention, to target-congruent objects in the search array. This deficit cannot be explained by low-level visual properties of the stimuli and is independent of the amygdala, but it is dependent on task demands. Overall, through visual psychophysics with concurrent eye-tracking, my thesis found and analyzed socially salient cues and compared social vs. non-social cues and healthy vs. clinical populations. Neural mechanisms underlying social saliency were elucidated through electrophysiology and lesion studies. I finally propose further research questions based on the findings in my thesis and introduce my follow-up studies and preliminary results beyond the scope of this thesis in the very last section, Future Directions.
Abstract:
During the VITAL cruise in the Bay of Biscay in summer 2002, two devices for measuring the length of swimming fish were tested: 1) a mechanical crown that emitted a pair of parallel laser beams and that was mounted on the main camera and 2) an underwater auto-focus video camera. The precision and accuracy of these devices were compared and the various sources of measurement errors were estimated by repeatedly measuring fixed and mobile objects and live fish. It was found that fish mobility is the main source of error for these devices because they require that the objects to be measured are perpendicular to the field of vision. The best performance was obtained with the laser method where a video-replay of laser spots (projected on fish bodies) carrying real-time size information was used. The auto-focus system performed poorly because of a delay in obtaining focus and because of some technical problems.
Abstract:
On a daily basis, humans interact with a vast range of objects and tools. A class of tasks, which can pose a serious challenge to our motor skills, are those that involve manipulating objects with internal degrees of freedom, such as when folding laundry or using a lasso. Here, we use the framework of optimal feedback control to make predictions of how humans should interact with such objects. We confirm the predictions experimentally in a two-dimensional object manipulation task, in which subjects learned to control six different objects with complex dynamics. We show that the non-intuitive behavior observed when controlling objects with internal degrees of freedom can be accounted for by a simple cost function representing a trade-off between effort and accuracy. In addition to using a simple linear, point-mass optimal control model, we also used an optimal control model, which considers the non-linear dynamics of the human arm. We find that the more realistic optimal control model captures aspects of the data that cannot be accounted for by the linear model or other previous theories of motor control. The results suggest that our everyday interactions with objects can be understood by optimality principles and advocate the use of more realistic optimal control models for the study of human motor neuroscience.
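The trade-off between effort and accuracy that the abstract attributes to its linear point-mass model is the classic finite-horizon LQR setting. A minimal sketch, with an assumed discretized point mass and illustrative cost weights `Q` (accuracy) and `R` (effort) that are not the paper's actual parameters:

```python
import numpy as np

# Discrete-time point mass: state [position, velocity], control = force.
# dt and the weights below are illustrative assumptions.
dt = 0.01
A = np.array([[1.0, dt], [0.0, 1.0]])
B = np.array([[0.0], [dt]])
Q = np.diag([1.0, 0.1])   # accuracy: penalize state error each step
R = np.array([[1e-4]])    # effort: penalize control

def lqr_gains(A, B, Q, R, horizon):
    """Backward Riccati recursion for finite-horizon LQR feedback gains."""
    P = Q.copy()
    gains = []
    for _ in range(horizon):
        K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
        P = Q + A.T @ P @ (A - B @ K)
        gains.append(K)
    return gains[::-1]  # gains[t] is the gain applied at step t
```

Simulating `x = A @ x + B @ (-K @ x)` from an offset start drives the state to the origin; shifting weight from `Q` to `R` yields lazier, less accurate movements, the trade-off the abstract describes.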
Abstract:
Humans skillfully manipulate objects and tools despite their inherent instability. In order to succeed at these tasks, the sensorimotor control system must build an internal representation of both force and mechanical impedance. As it is not practical to either learn or store motor commands for every possible future action, the sensorimotor control system generalizes a control strategy for a range of movements based on learning performed over a set of movements. Here, we introduce a computational model for this learning and generalization, which specifies how to learn feedforward muscle activity as a function of the state space. Specifically, by incorporating co-activation as a function of error into the feedback command, we are able to derive an algorithm from a gradient descent minimization of motion error and effort, subject to maintaining a stability margin. This algorithm can be used to learn to coordinate any of a variety of motor primitives such as force fields, muscle synergies, physical models or artificial neural networks. This model of human learning and generalization is able to adapt to both stable and unstable dynamics, and provides a controller for generating efficient adaptive motor behavior in robots. Simulation results exhibit predictions consistent with all experiments on learning of novel dynamics requiring adaptation of force and impedance, and enable us to re-examine some of the previous interpretations of experiments on generalization. © 2012 Kadiallah et al.
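The error-driven adaptation of feedforward force and co-activation described above can be caricatured in a few lines. This scalar, trial-by-trial sketch is a loose illustration of the idea, not the authors' actual algorithm; every constant and the toy "movement error" expression are invented:

```python
import numpy as np

def adapt(trials=200, force=5.0, alpha=0.1, beta=0.05, gamma=0.01):
    """Toy trial-by-trial adaptation: movement error drives up both the
    feedforward command (to cancel the external force) and co-activation
    (stiffness, to resist it), while a small effort term decays both.
    All parameters are illustrative assumptions."""
    u_ff, stiffness = 0.0, 1.0
    errors = []
    for _ in range(trials):
        # toy movement error: uncompensated force, attenuated by stiffness
        e = (force - u_ff) / stiffness
        errors.append(abs(e))
        u_ff += alpha * e - gamma * u_ff                   # error-driven, effort-decayed
        stiffness += beta * abs(e) - gamma * (stiffness - 1.0)
    return errors
```

Over trials the error shrinks as feedforward force takes over and stiffness relaxes toward baseline, qualitatively matching the force-and-impedance adaptation the abstract describes.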
Abstract:
We present algorithms for tracking and reasoning about local traits at the subsystem level based on the observed emergent behavior of multiple coordinated groups in potentially cluttered environments. Our proposed Bayesian inference schemes, which are primarily based on (Markov chain) Monte Carlo sequential methods, include: 1) an evolving network-based multiple object tracking algorithm that is capable of categorizing objects into groups, 2) a multiple cluster tracking algorithm for dealing with a prohibitively large number of objects, and 3) a causality inference framework for identifying dominant agents based exclusively on their observed trajectories. We use these as building blocks for developing a unified tracking and behavioral reasoning paradigm. Both synthetic and realistic examples are provided to demonstrate the derived concepts. © 2013 Springer-Verlag Berlin Heidelberg.
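The bootstrap particle filter is the basic sequential Monte Carlo building block behind tracking schemes of this kind. A minimal 1-D sketch, with an assumed constant-velocity motion model and Gaussian observation noise (this is generic, not the paper's group-tracking algorithms):

```python
import numpy as np

def particle_filter(obs, n=500, q=0.1, r=0.5, seed=0):
    """Bootstrap (sequential Monte Carlo) filter for a 1-D target with
    constant-velocity dynamics, observed through noisy position
    measurements. Returns the posterior-mean position after the last
    observation. Noise scales q, r are illustrative assumptions."""
    rng = np.random.default_rng(seed)
    x = rng.normal(0.0, 1.0, size=(n, 2))   # particles: [position, velocity]
    for z in obs:
        # predict: constant-velocity motion plus process noise on velocity
        x[:, 0] += x[:, 1]
        x[:, 1] += rng.normal(0.0, q, size=n)
        # weight each particle by the Gaussian likelihood of the observation
        w = np.exp(-0.5 * ((z - x[:, 0]) / r) ** 2)
        w /= w.sum()
        # resample particles in proportion to their weights
        x = x[rng.choice(n, size=n, p=w)]
    return x[:, 0].mean()                    # posterior mean position
```

Group and cluster trackers of the kind summarized above layer association and interaction models on top of this predict-weight-resample loop.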
Abstract:
We present Random Partition Kernels, a new class of kernels derived by demonstrating a natural connection between random partitions of objects and kernels between those objects. We show how the construction can be used to create kernels from methods that would not normally be viewed as random partitions, such as Random Forest. To demonstrate the potential of this method, we propose two new kernels, the Random Forest Kernel and the Fast Cluster Kernel, and show that these kernels consistently outperform standard kernels on problems involving real-world datasets. Finally, we show how the form of these kernels lends itself to a natural approximation that is appropriate for certain big data problems, allowing $O(N)$ inference in methods such as Gaussian Processes, Support Vector Machines and Kernel PCA.
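The construction can be sketched concretely: draw many random partitions of the data and average the co-membership indicator matrices. The hyperplane splits below are a cheap stand-in partition generator, loosely in the spirit of the Fast Cluster Kernel but not the paper's exact construction:

```python
import numpy as np

def random_partition_kernel(X, m=200, seed=0):
    """Sketch of a Random Partition Kernel: K[i, j] is the fraction of m
    random partitions that place points i and j in the same cell. Here
    each partition is a random affine hyperplane split (an illustrative
    stand-in; the paper builds partitions from Random Forest leaves or
    cheap clusterings)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    K = np.zeros((n, n))
    for _ in range(m):
        # one random two-cell partition: sign of a random affine projection
        w = rng.normal(size=d)
        b = rng.normal()
        cell = (X @ w + b > 0)
        K += cell[:, None] == cell[None, :]   # co-membership indicator
    return K / m   # average co-occurrence; PSD as a mean of PSD block matrices
```

Any procedure that partitions data (forest leaves, k-means runs, hash buckets) can be dropped into the loop; the averaged co-membership matrix is always a valid kernel.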