Sunday, October 12, 2014

The Graphics Card



A video card (also called a video adapter, display card, graphics card, graphics board, display adapter, graphics adapter or frame buffer, and sometimes preceded by the word discrete or dedicated to emphasize the distinction between this implementation and integrated graphics) is an expansion card which generates a feed of output images to a display (such as a computer monitor). Within the industry, video cards are sometimes called graphics add-in boards, abbreviated as AIBs, with the word "graphics" usually omitted.


Dedicated vs integrated graphics

As an alternative to the use of a video card, video hardware can be integrated into the motherboard or the CPU. Both approaches can be called integrated graphics. Motherboard-based implementations are sometimes called "on-board video", while CPU-based implementations are called accelerated processing units (APUs). Almost all desktop computer motherboards with integrated graphics allow the disabling of the integrated graphics chip in the BIOS, and have a PCI or PCI Express (PCI-E) slot for adding a higher-performance graphics card in place of the integrated graphics. The ability to disable the integrated graphics sometimes also allows the continued use of a motherboard on which the on-board video has failed. Sometimes both the integrated graphics and a dedicated graphics card can be used simultaneously to feed separate displays.

The main advantages of integrated graphics are cost, compactness, simplicity and low energy consumption. Its performance disadvantage arises because the graphics processor shares system resources with the CPU. A dedicated graphics card has its own random access memory (RAM), its own cooling system, and dedicated power regulators, with all components designed specifically for processing video images. Upgrading to a dedicated graphics card offloads work from the CPU and system RAM, so not only will graphics processing be faster, but the computer's overall performance may also improve.
Both of the dominant CPU makers, AMD and Intel, are moving to APUs. One of the reasons is that graphics processors are powerful parallel processors, and placing them on the CPU die allows their parallel processing ability to be harnessed for various computing tasks in addition to graphics processing. (See Heterogeneous System Architecture, which discusses AMD's implementation.) APUs are the newer integrated graphics technology and, as costs decline, will probably be used instead of integrated graphics on the motherboard in most future low and mid-priced home and business computers. As of late 2013, the best APUs provide graphics processing approaching mid-range mobile video cards and are adequate for casual gaming. Users seeking the highest video performance for gaming or other graphics-intensive uses should still choose computers with dedicated graphics cards. (See Size of market and impact of accelerated processing units on video card sales, below.)
Beyond the enthusiast segment is the market for professional video cards for workstations used in the special-effects industry and in fields such as design, analysis and scientific research. Nvidia is a major player in the professional segment. In November 2013, AMD introduced a so-called "Supercomputing" graphics card "designed for data visualization in finance, oil exploration, aeronautics and automotive, design and engineering, geophysics, life sciences, medicine and defense."

The Microchip



By definition, the integrated circuit (aka microchip) is a set of interconnected electronic components, such as transistors and resistors, that are etched or imprinted onto a tiny chip of a semiconducting material, such as silicon or germanium. The integrated circuit, otherwise known as "the chip" or microchip, was invented independently by Jack Kilby and Robert Noyce.

How Microchips Are Made

Microchips are built layer by layer on a wafer of the semiconductor material silicon. The layers are built by a process called photolithography involving chemicals, gases, and light.
First, a layer of silicon dioxide is deposited on the surface of the silicon wafer; that layer is then covered with a photosensitive chemical called a photoresist.
The photoresist is exposed to ultraviolet light shined through a pattern, which hardens only the areas exposed to the light. Gas is then used to etch away the remaining soft areas. This process, repeated and modified, builds up the component circuitry.
Conducting paths between the components are created by overlaying the chip with a thin layer of metal (aluminum). The photolithography and etching processes are used to remove the metal, leaving only the conducting pathways.
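The repeated deposit–mask–etch cycle described above can be sketched as a toy model in Python. This is purely illustrative: the layer materials and mask positions below are hypothetical, and real fabrication involves far more steps.

```python
# Toy model of the photolithography cycle: deposit a layer, protect
# parts of it with hardened photoresist (the mask), etch the rest away.

def photolithography_cycle(wafer, material, mask):
    """Deposit one layer on the wafer and etch it through a mask.

    wafer:    list of layers built so far
    material: material name for this cycle (hypothetical examples below)
    mask:     set of positions protected by hardened photoresist
    """
    # After etching, only the masked (protected) regions of the layer remain.
    wafer.append({"material": material, "pattern": sorted(mask)})
    return wafer

wafer = []
photolithography_cycle(wafer, "silicon dioxide", {0, 2, 4})
photolithography_cycle(wafer, "polysilicon", {1, 3})
photolithography_cycle(wafer, "aluminum", {0, 1, 2})  # metal interconnect layer

print(len(wafer))             # 3
print(wafer[-1]["material"])  # aluminum
```

Each call models one pass of the deposit–expose–etch loop; the chip's circuitry emerges from the stack of patterned layers.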
The websites and videos below describe the process in detail.
Importance Of Technology




Importance of Technology in Our Lives Today
To understand and explore the importance of technology in our daily lives, let us first define the term 'technology'. Technology refers to the use of tools, gadgets and resources that help us control and adapt to our environment. Originating from the Greek word 'technologia', the term also refers to the use of machines and utensils which make our daily lives simpler and more organized.
Modern technology can be considered a direct consequence of the innovations and studies in the fields of science and engineering. There is no doubt that technology has greatly influenced modern society and lifestyles. While most of these influences have been positive, there are a few negative ones. But that is not the topic of this article. Here we shall focus only on the positive impact and significance of technology in the fields of education, business and health care.



Importance of Technology in Education
Technology plays a significant role in the educational field. Math, reading and writing skills can all be improved using technological advances. Students do not have to bear with mundane learning cycles anymore, as more and more teachers are gravitating towards the use of interactive tools and media to make learning interesting. Students can prepare for their future using educational materials that are easily available, thanks to information technology. Teachers are able to deliver content effortlessly to students and can also research complex subjects in the classroom itself.
It is not just teachers and students who benefit from technology but also educational administrators and parents. The benefits of technology in this field can be summarized as below:
  1.   Technology encourages learning in a positive manner
  2.   It improves students’ skill sets
  3.   It helps prepare the future workforce.
Examples of technological use in the field of education include: use of spreadsheets for math and other topics, videoconferencing for distance learning, creation of web pages to display and share student work, Internet searches for exploring complex topics, etc.
For technology to have a positive impact in the field of education, teachers must set clear goals; otherwise technology can be misused. Teachers also need to be trained, and this education must be ongoing. Technical support also needs to be provided in order to ensure correct handling of tools.
Importance of Technology in Business
Technology has also helped small businesses evolve and expand quickly. The use of social networking, video conferencing, virtual office tools and other such techniques has removed the boundaries which, in the past, prevented growth. Thanks to technology, businesses small and large can reach a wider customer base and grow and expand.
Business technology has helped improve communication. Today, workers are not limited to phone calls alone; they can send emails and messages without the fear of interrupting the recipient. Mobile technology has also helped workers communicate ‘on the go’. Information is not limited to one or two channels, but multiple and faster ones.
In general, the efficiency of the workforce has also increased. Employers are able to screen, recruit and hire potential candidates quickly; they are also able to advertise vacancies to a larger number of applicants. Personality and IQ assessment tools have also been made available to employers, and these make the screening process smoother and more streamlined. Digital filing has helped improve organization and efficiency in the workplace. Printing costs, paper consumption and space can all be saved thanks to electronic filing systems.
Perhaps the greatest advantage of technology for businesses is the elimination of wasted time and money. Thanks to videoconferencing and the Internet, travel costs can be drastically cut down. A business can set up a presence across the globe at a fraction of the cost required in the past.
Importance of Technology in Healthcare
The importance of technology in healthcare can be summarized with this single sentence:  “Technology saves lives”. Some of the objectives that healthcare information technology has fulfilled include:
  1. creation of social support networks for patients
  2. self-management tools and resources that patients can use with ease
  3. easy access to accurate and actionable health information for patients and families
  4. quick communication and resolution of health risks and public health emergencies
  5. provision of newer opportunities to culturally diverse and hard-to-access nations
  6. improvement of quality and safety in health care
  7. improved public health infrastructure
  8. facilitation of clinical and consumer decision making
  9. development of health skills and know-how
Health technology has improved organization and efficiency. It has helped eliminate ambiguity, and every record, from billing to diagnosis and treatment, can be maintained for easy access by healthcare providers. By using software and hardware tools, profiles of patients can be created so doctors can provide standardized treatment. This helps improve patient outcomes and thus reduces the cost of health care.
In conclusion
It is evident that technology touches each and every facet of our lives. Be it management of employees and inventories, searching for answers to complex problems, or facilitating the lives of the sick and disabled, technology has magnificently enhanced the quality of life and also boosted the economy of the world.

Sunday, October 5, 2014

Wi-Fi




Wi-Fi, also spelled Wifi or WiFi, is a local area wireless technology that allows an electronic device to exchange data or connect to the Internet using 2.4 GHz UHF and 5 GHz SHF radio waves. The name is a trademark, and a play on the audiophile term Hi-Fi. The Wi-Fi Alliance defines Wi-Fi as any "wireless local area network (WLAN) products that are based on the Institute of Electrical and Electronics Engineers' (IEEE) 802.11 standards". However, since most modern WLANs are based on these standards, the term "Wi-Fi" is used in general English as a synonym for "WLAN". Only Wi-Fi products that successfully complete Wi-Fi Alliance interoperability certification testing may use the "Wi-Fi CERTIFIED" trademark.
Many devices can use Wi-Fi, e.g., personal computers, video-game consoles, smartphones, some digital cameras, tablet computers and digital audio players. These can connect to a network resource such as the Internet via a wireless network access point. Such an access point (or hotspot) has a range of about 20 meters (66 feet) indoors and a greater range outdoors. Hotspot coverage can comprise an area as small as a single room with walls that block radio waves, or as large as many square kilometres achieved by using multiple overlapping access points.
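The 2.4 GHz band mentioned above is divided into numbered channels whose centre frequencies are spaced 5 MHz apart. For channels 1 through 13 the centre frequency follows a simple formula from the IEEE 802.11 standard; channel 14 (permitted only in Japan) is a special case. A small sketch:

```python
def channel_center_mhz(channel):
    """Centre frequency (MHz) of a 2.4 GHz Wi-Fi channel per IEEE 802.11."""
    if channel == 14:           # Japan-only channel, offset from the pattern
        return 2484
    if 1 <= channel <= 13:
        return 2407 + 5 * channel
    raise ValueError("not a 2.4 GHz channel")

print(channel_center_mhz(1))    # 2412
print(channel_center_mhz(6))    # 2437
print(channel_center_mhz(11))   # 2462
```

Because each channel is about 20 MHz wide but only 5 MHz apart from its neighbours, adjacent channels overlap; this is why channels 1, 6 and 11 are the usual non-overlapping choices.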
[Figure: depiction of a device sending information wirelessly to another device, both connected to the local network, in order to print a document.]
Wi-Fi can be less secure than wired connections (such as Ethernet) because an intruder does not need a physical connection. Web pages that use SSL are secure, but unencrypted Internet access can easily be detected by intruders. Because of this, Wi-Fi has adopted various encryption technologies. The early encryption standard, WEP, proved easy to break. Higher-quality protocols (WPA, WPA2) were added later. An optional feature added in 2007, called Wi-Fi Protected Setup (WPS), had a serious flaw that allowed an attacker to recover the router's password. The Wi-Fi Alliance has since updated its test plan and certification program to ensure all newly certified devices resist attacks.
Digital Pen





A digital pen is an input device which captures the handwriting or brush strokes of a user, converting handwritten analog information created using "pen and paper" into digital data and enabling the data to be utilized in various applications. For example, the writing data can be digitized, uploaded to a computer and displayed on its monitor. The data can then be interpreted by handwriting-recognition software (OCR) and used in different applications, or simply as graphics. Digital pens typically contain internal electronics and have features such as touch sensitivity, input buttons, memory, writing-data transmission capabilities, and electronic erasers.

Products List

Manufacturer | Product | Uses Anoto pattern | Remarks
Polyvision | Eno | y |
N-trig | DuoSense Pen | n | Active pen that is bundled with a wide range of touchscreen devices from Microsoft, Sony, Asus, Acer, HTC and others
Wacom | Inkling | n | A digital pen for drawing
Irislink | IRISnotes | n |
e-pens | e-pens product comparison | n | e-pens is a manufacturer of digital pen technology with a range of products that can be used in conjunction with Windows PCs, Mac OS X, Android, Blackberry, iPhone, iPad and iPod touch devices
Livescribe | Livescribe | y |
Logipen | NOTES | ? |
Logitech | io (and io2) Digital Pen | y |
Maxell | Penit | y |
InfoMax Technologies | Digital Pen Solution | ? |
Atary | Atary Digital Pen | ? | Can be used as a regular pen or as a stylus with a Teflon adapter
Nokia | Digital Pen SU-1B | y |
IOGear | GPen300 | n |
Staedtler | Digital Pen | n |
wwsolution.in | Trackball pen | n | Manufactured by wwsolution.in
Oxford Papershow | Trackball pen | y | The technology involves a USB key to synchronise with the digital pen, enabling instant connectivity between pen and PC
History of Technology



The history of technology is the history of the invention of tools and techniques, and is similar in many ways to the history of humanity. Background knowledge has enabled people to create new things, and conversely, many scientific endeavors have become possible through technologies which assist humans to travel to places we could not otherwise go, and probe the nature of the universe in more detail than our natural senses allow.

Technological artifacts are products of an economy, a force for economic growth, and a large part of everyday life. Technological innovations affect, and are affected by, a society's cultural traditions. They also are a means to develop and project military power.

Many sociologists and anthropologists have created social theories dealing with social and cultural evolution. Some, like Lewis H. Morgan, Leslie White, and Gerhard Lenski, declare technological progress to be the primary factor driving the development of human civilization. Morgan's concept of three major stages of social evolution (savagery, barbarism, and civilization) can be divided by technological milestones, such as fire, the bow, and pottery in the savage era; domestication of animals, agriculture, and metalworking in the barbarian era; and the alphabet and writing in the civilization era.
Instead of specific inventions, White decided that the measure by which to judge the evolution of culture was energy. For White, "the primary function of culture" is to "harness and control energy." White differentiates between five stages of human development: in the first, people use the energy of their own muscles; in the second, the energy of domesticated animals; in the third, the energy of plants (the agricultural revolution); in the fourth, they learn to use the energy of natural resources: coal, oil, gas; in the fifth, they harness nuclear energy. White introduced a formula P=E*T, where E is a measure of energy consumed and T is a measure of the efficiency of the technical factors utilizing the energy. In his own words, "culture evolves as the amount of energy harnessed per capita per year is increased, or as the efficiency of the instrumental means of putting the energy to work is increased". The Russian astronomer Nikolai Kardashev extrapolated White's theory, creating the Kardashev scale, which categorizes the energy use of advanced civilizations.
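White's P=E*T relationship and the Kardashev scale both come down to simple arithmetic. A sketch using Carl Sagan's continuous interpolation of the Kardashev scale, K = (log10 P − 6) / 10 with P in watts; the wattage figures below are rough, commonly cited estimates, not figures from this article:

```python
import math

def white_culture(energy_per_capita, efficiency):
    """White's P = E * T: cultural development as energy times efficiency."""
    return energy_per_capita * efficiency

def kardashev_rating(power_watts):
    """Sagan's continuous Kardashev rating: K = (log10(P) - 6) / 10."""
    return (math.log10(power_watts) - 6) / 10

# Doubling either energy harnessed or efficiency doubles White's measure.
print(white_culture(2.0, 3.0))              # 6.0

# A Type I civilization is conventionally pegged at about 10^16 W.
print(round(kardashev_rating(1e16), 2))     # 1.0

# Humanity's consumption (~2e13 W, a rough estimate) rates about 0.73.
print(round(kardashev_rating(2e13), 2))     # 0.73
```

On this scale, each tenfold increase in harnessed power raises the rating by 0.1, which is why advancing a full Kardashev type requires a ten-billion-fold jump in energy use.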
Lenski takes a more modern approach and focuses on information. The more information and knowledge (especially allowing the shaping of the natural environment) a given society has, the more advanced it is. He identifies four stages of human development, based on advances in the history of communication. In the first stage, information is passed by genes. In the second, when humans gain sentience, they can learn and pass information on through experience. In the third, humans start using signs and develop logic. In the fourth, they can create symbols and develop language and writing. Advancements in the technology of communication translate into advancements in the economic system and political system, distribution of wealth, social inequality and other spheres of social life. He also differentiates societies based on their level of technology, communication and economy:
  • hunters and gatherers,
  • simple agricultural,
  • advanced agricultural,
  • industrial,
  • special (such as fishing societies).
Finally, from the late 1970s, sociologists and anthropologists like Alvin Toffler (author of Future Shock), Daniel Bell and John Naisbitt have approached the theories of post-industrial societies, arguing that the current era of industrial society is coming to an end, and that services and information are becoming more important than industry and goods. Some of the more extreme visions of the post-industrial society, especially in fiction, are strikingly similar to the visions of near- and post-Singularity societies.

Saturday, September 13, 2014

The Oculus Rift








Low Latency 360° Head Tracking

The Rift uses custom tracking technology to provide ultra-low latency 360° head tracking, allowing you to seamlessly look around the virtual world just as you would in real life. Every subtle movement of your head is tracked in real time creating a natural and intuitive experience.
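Head tracking of this kind is typically done by integrating readings from inertial sensors over time. A minimal, hypothetical sketch (real trackers fuse gyroscope, accelerometer and magnetometer data and correct for drift) that integrates a gyroscope's yaw-rate samples into a head angle:

```python
def integrate_yaw(samples, dt):
    """Integrate angular-velocity samples (deg/s) into a yaw angle (degrees).

    samples: gyroscope yaw-rate readings taken every dt seconds.
    A real head tracker also handles pitch and roll, and corrects for
    sensor drift; this sketch shows only the basic Euler integration.
    """
    yaw = 0.0
    for rate in samples:
        yaw += rate * dt   # accumulate rotation over each sample interval
    return yaw

# 100 samples of 90 deg/s at 1000 Hz: the head turned 9 degrees.
print(integrate_yaw([90.0] * 100, 0.001))   # 9.0 (approximately)
```

Keeping dt small (hence the high sample rates in VR hardware) is what keeps the integrated orientation responsive and the perceived latency low.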




Stereoscopic 3D View

The Oculus Rift creates a stereoscopic 3D view with excellent depth, scale, and parallax. Unlike 3D on a television or in a movie, this is achieved by presenting a unique, parallel image to each eye. This is the same way your eyes perceive images in the real world, creating a much more natural and comfortable experience.
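The per-eye images are usually rendered from two camera positions separated by the interpupillary distance (IPD). A hypothetical sketch (the 64 mm IPD is a commonly cited human average, not a figure from this article):

```python
def eye_positions(head_x, ipd_m=0.064):
    """Return (left, right) camera x-positions, offset by half the IPD each.

    head_x: x-coordinate of the point between the eyes, in metres.
    Rendering the scene once from each position produces the two slightly
    different images whose disparity the brain fuses into depth.
    """
    half = ipd_m / 2.0
    return (head_x - half, head_x + half)

left, right = eye_positions(0.0)
print(left, right)   # -0.032 0.032
```

The small horizontal disparity between the two rendered views is what carries the depth and parallax cues described above.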




Ultra Wide Field of View

The Oculus Rift provides an approximately 100° field of view, stretching the virtual world beyond your peripheral vision. Your view of the game is no longer boxed in on a screen and is only limited by what your eyes can see. The combination of the wide field of view with head-tracking and stereoscopic 3D creates an immersive virtual reality experience.