Friday, February 27, 2009

Intel invests in Ireland

Intel has announced that it will invest over €50m ($63.5m, £44.3m) to expand its Irish operations, a move that will boost employment at its R&D facilities in Shannon by 134 jobs over the next four years.
According to IDA Ireland, the investment will raise total staffing at the facility to 300. Two projects will be supported by the funding, one in hardware and one in software.
The hardware component will include design and validation of 32nm embedded processors for "small to medium sized businesses," according to IDA Ireland. The software project will focus on Intel's QuickAssist Technology, which is aimed at boosting the performance of what Intel refers to as "high-bandwidth and low-latency accelerators."
In a statement commenting on the expansion, Intel Ireland's general manager Jim O’Hara said: "In our highly competitive and rapidly changing global markets, success is hard earned... This investment makes a genuine contribution to the advancement of Ireland and of Ireland in Europe."
While the addition of 134 employees over four years is, indeed, a "genuine contribution," it's a small one - especially considering that the giant chipmaker is currently executing a layoff plan that will cut 5,000 to 6,000 jobs worldwide, including 200 to 300 at its Leixlip plant in Ireland.
But while one hand cuts, the other reaches into its wallet: Intel recently announced that it would invest $7b (£4.9b) over the next two years in the US alone.

Thursday, February 26, 2009

AMD Steps In On Unlocking Phenom X3s

Following up on earlier reports of being able to unlock the disabled core on Phenom II X3 CPUs, techPowerUp says that AMD is stepping in on the issue.
Reports indicate that many X3s can have their fourth core enabled, suggesting that AMD simply marked the core as bad and was too lazy to use a laser to cut the link to the extra core.
While news about unlocking the fourth core appears to have caused a surge in demand for Phenom II X3 CPUs, AMD seems determined to rain on enthusiasts' parade. AMD has asked motherboard manufacturers to fix the BIOS option immediately and has even gone so far as to ask them not to release motherboards with the so-called "buggy" BIOS.
AMD fears it is missing out on profits from the higher-priced Phenom II X4 procs. That is probably true, but everyone at AMD should have seen this coming when the company dangled a part in front of consumers that is identical to its higher-priced offerings except for the disabled features.
While future boards are likely to ship with a newer BIOS that has the Advanced Clock Calibration (ACC) option gimped or disabled, you should still be able to flash back to a previous version to see if you, too, can get a free core out of your Phenom II X3.

AMD demos six-core Istanbul server CPU

AMD recently showed off its upcoming Istanbul six-core server processor that is built using a 45nm process. And while Intel, unsurprisingly, beat AMD to the punch by launching its Dunnington processor back in September of 2008, Istanbul should be the first monolithic six-core processor that will be available for dual- and quad-socket server configurations when it launches in the second half of this year.
The company revealed only a few specifics of the upcoming chip but promised better memory throughput and said the new parts will be thermally and socket compatible with existing AMD servers in the field to allow a seamless upgrade process. Besides the new six-core processor, AMD is planning to launch a new server platform in the later part of 2009 called “Fiorano,” which will use the company’s 45nm chips along with the latest HyperTransport 3 technology.

Monday, February 23, 2009

iEnterprises CRM Goes On-Demand with Help from IBM

Hope springs anew in the IT business. For the last five to 10 years, there's been plenty of hope in the areas of software as a service (SaaS), open source software, service oriented architecture (SOA), CRM, and mobile computing, but it mostly failed to live up to the hype. Now, as a result of a partnership between iEnterprises and IBM, System i shops may gain access to advanced CRM software that utilizes these technologies, without incurring the costs of building it themselves.

iEnterprises is a Murray Hill, New Jersey-based CRM software developer that was founded by John Carini, who once worked as an RPG programmer for several AS/400 shops. As the CEO and chief software architect of iEnterprises, Carini doesn't do a lot of RPG programming these days. But his company and its various CRM products--which are built with and run on a modern mix of PHP, Java, Domino, WebSphere, Linux, and SQL--do have a fair share of AS/400 customers.
Several months ago, iEnterprises launched its second major CRM product, called Empower. The software covers all the basics you would expect to find in a CRM product, including the capability to manage accounts, contacts, projects, and sales and marketing activities. It helps users capture leads and generate quotes; connects to Outlook or Lotus Notes for e-mail and calendaring; and integrates with ERP systems. The product is also highly customizable, enabling customers to add fields and business logic to satisfy their specific requirements, and it features so-called "smart client" interfaces for the major mobile phone platforms: iPhone, BlackBerry, and Windows Mobile.
The capability to access an enterprise CRM system from a smart phone is definitely cool. But the speedy and flexible deployment options offered by iEnterprises' Empower CRM software are probably the most distinctive aspect of this particular offering. As a result of the partnership with IBM and its SaaS business partner program, customers can be up and running with their own Empower installation with just a few clicks of a mouse in a Web browser.
The SaaS offering makes a wonderful sales tool, CEO Carini tells IT Jungle. "We actually find the SaaS program a great way to reach either smaller customers or prospects that don't want to make a commitment without seeing what they're going to get," he says. "Maybe we have a System i prospect, but they don't want to jump right in. Maybe they want to try the application and see if it really fits their needs. They can sign up for it, get it as a SaaS offering, and if they like it, they can take it and deploy it behind their firewall, either in their private cloud or on their System i."
As one of only 200 designated SaaS specialty partners in IBM's SaaS partner program, iEnterprises enjoys the considerable backing and access to resources that only Big Blue can provide. To be designated a SaaS specialty partner, a vendor must use IBM exclusively for two out of the following three requirements: hardware, middleware, or hosting services. In iEnterprises' case, it uses IBM hardware and IBM middleware; it contracts with a different company to actually host its customers' Empower installations, which run on xSeries hardware and the LAMP stack.
SaaS is hot for IBM at the moment. The company doubled the number of SaaS specialty partners last year, and is looking for more ISVs to join its overall SaaS community, which numbers more than 3,000. Working with IBM to SaaS-ify applications can be a strategic move for ISVs, according to Dave Mitchell, IBM's director of strategy and emerging business for ISVs and developer relations and the head of the SaaS specialty program.
IBM helps ISVs with SaaS in four areas, Mitchell says. First, the ISVs bring the business applications. Second, ISVs get assistance from IBM to design, build, deliver, and market their SaaS solutions. "The third area we play is helping our clients to integrate SaaS into their business," he says. "In recent years we've seen integration become the number one concern around the adoption of SaaS solutions." The fourth area is building private "clouds" for customers.
Carini expects the poor economy to bolster demand for SaaS applications such as Empower. "Any time you can give somebody a solution that has less risk, they're more apt to look at it," he says. "The administrator and the entire IT department don't have to spend days figuring out how to get infrastructure set up and how to get an application off the ground. They can literally just click a button and get going with it."

IBM Delivers Software to Ontario Universities via Cloud Computing

IBM (NYSE: IBM) and the Ontario Centres of Excellence today launched a pilot project that gives university students, professors, and other researchers anytime, anywhere access to some of IBM's leading business software via cloud computing.
Cloud computing is an emerging compute model for delivering and consuming IT capabilities as a service. This Tools as a Service (TaaS) technology showcase, which enables researchers to have 24/7 access to IBM's WebSphere Integration Developer and Rational Software Architect over the Internet, is a major milestone for the IBM Canada Centre for Advanced Studies and its partners in the Centre of Excellence for Research in Adaptive Systems (CERAS).
Early adopters of this pilot include the University of Waterloo, York University, Queen's University, University of Toronto, Carleton University, the Ontario Cancer Institute, and developers from the IBM Canada software lab.
This Tools as a Service initiative will provide the participating research institutions with tremendous productivity increases, along with cost and energy savings. The researchers will have access to some of the latest enterprise software securely without needing to upgrade their own hardware systems or expand their data centers, and without the need for on-site expertise to install and get the tools running. TaaS provides the end user with sufficient computing resources to use the software, regardless of the capability of their own hardware. TaaS is able to preserve users' data from their last session and restore it so they can continue from where they left off with the software tools.
"Traditionally, software development tools were installed and run on workstations and students had access to them only during the Lab hours," says Dr. Marin Litoiu, Computer Science Professor, York University. "Tools as a Service allows the students and researchers to access IBM software tools anytime and from anywhere, using minimal web infrastructure. Besides being very convenient for the end users, Tools as a Service enables better use of university resources, as the same cloud infrastructure can be used for teaching and research, and by many users."
The software is deployed on the CERAS cloud computing infrastructure which is an expandable collection of IBM BladeCenter servers physically located on the campuses of some of the participating universities and virtualized by a virtual machine monitor. All of the students and other researchers participating in this pilot can access the tools on the CERAS cloud through a single web portal.
"Organizations of all kinds need smarter ways to manage their IT infrastructure," says Martin Wildberger, Vice President, IBM Sensor Solutions and Director, IBM Canada Software Lab. "This cloud computing pilot enables the participating research institutions to more efficiently manage their own IT resources, while providing their researchers with easy and much needed access to industry-leading business software tools."
CERAS is a research partnership established between IBM Canada's Centre for Advanced Studies, the Ontario Centres of Excellence, and eight leading universities to advance the development of next-generation software services and applications, including software as a service. Research results from this collaboration were applied in the commercialized TaaS solution, whose software engineering implementation was done by the IBM Centre for Advanced Studies' Technology Incubation Lab together with a post-doctoral fellow and graduate students from the participating universities.
Other university-related cloud deployments include the October 2008 IBM and NC State University announcement of the availability of the Virtual Computing Lab, made available to all educational institutions via the Apache open source community.
In addition to this initiative, in October 2007, IBM partnered with Google to launch the Cloud Computing University Initiative -- an effort to promote new software development methods and help students and researchers address the challenges of internet-scale applications in the future. Dozens of universities in the United States now have access to this joint cloud computing cluster.
Ontario Centres of Excellence (OCE) Inc. drives the commercialization of cutting-edge research across key market sectors to build the economy of tomorrow and secure Ontario's global competitiveness.
In doing this, OCE fosters the training and development of the next generation of innovators and entrepreneurs and is a key partner with Ontario's industry, universities, colleges, research hospitals, investors and governments. OCE's five Centres work in communications and information technology, earth and environmental technologies, energy, materials and manufacturing and photonics.

Sunday, February 22, 2009

DSOA has talks with Intel on possible collaboration

Dubai: Shaikh Ahmad Bin Saeed Al Maktoum, Chairman of Dubai Silicon Oasis Authority (DSOA), said on Sunday it has held talks with Paul Otellini, chief executive of Intel Corp.
During the meeting, Shaikh Ahmad and Otellini explored possible collaborations between Emirates airline and Dubai airports, leveraging Intel's advanced IT solutions and services.
Dr Mohammad Al Zarouni also briefed Intel's CEO on the advantages of and facilities offered by DSOA as a high-technology park for the microelectronics and semiconductor industry.

Judge asked to settle Intel-NVIDIA computer chip squabble

SAN FRANCISCO (AFP) — A judge has been asked to settle a squabble between the world's largest chip maker Intel and graphics computing speciality firm NVIDIA.
The northern Californian companies have been arguing for a year about whether a deal they inked in 2004 allows NVIDIA to produce chipsets that work with Intel microprocessors that have integrated-memory controller features.
Intel filed a complaint this week in Chancery Court in the state of Delaware, asking a judge to decide which side is right.
"We got to the point where we said 'this is enough," said Intel spokesman Chuck Malloy.
"We aren't seeking an injunction or asking for damages. It has just been a very longstanding dispute and we couldn't resolve it, so we will let the court decide."
NVIDIA president Jen-Hsun Huang fired back on Wednesday, saying his firm is within the boundaries of its licensing deal with Intel.
He charges that Intel's true goal is to stymie graphics processing unit (GPU) technology that is becoming a competitive threat to central processing units (CPUs).
"At the heart of this issue is that the CPU has run its course and the soul of the PC is shifting quickly to the GPU," Huang said.
"This is clearly an attempt to stifle innovation to protect a decaying CPU business."
NVIDIA, founded in 1993, became renowned for GPUs that drive sophisticated computer game and video hardware.
Its graphics chips have been evolving to augment and even supplant the CPUs at the heart of most computers.
While CPUs typically handle tasks in a linear style, zipping from start to finish in series, GPUs work on tasks simultaneously in order to do things such as get color pixels together on screens to present moving images.
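
To make that serial-versus-parallel distinction concrete, here is a loose sketch in Python (not NVIDIA's or Intel's code, and NumPy's vectorization only approximates what a GPU does in hardware): the same per-pixel brightness adjustment written as a one-pixel-at-a-time loop and as a single data-parallel operation.

    # Loose illustration: the same per-pixel brightness adjustment expressed
    # serially and as one data-parallel operation over a hypothetical frame.
    import numpy as np

    # A made-up 480x640 grayscale frame with random pixel values.
    frame = np.random.randint(0, 256, size=(480, 640), dtype=np.uint16)

    # CPU-style serial loop: visit each pixel one after another.
    brighter_serial = frame.copy()
    for y in range(frame.shape[0]):
        for x in range(frame.shape[1]):
            brighter_serial[y, x] = min(int(frame[y, x]) + 40, 255)

    # GPU-style data parallelism (approximated here with a vectorized NumPy
    # expression): the same adjustment applied to every pixel "at once."
    brighter_parallel = np.minimum(frame + 40, 255)

    assert np.array_equal(brighter_serial, brighter_parallel)
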
Sets of NVIDIA chips built for speed, power, and superior graphics replaced Intel models in upgraded MacBook laptop computers recently rolled out by Apple. MacBook computing tasks are still done by Intel CPUs.
Acer, Alienware, Asus, Dell, HP, Lenovo, MSI, NEC, and Toshiba are among the computer makers that combine NVIDIA and Intel technologies in hardware.
Intel counters that NVIDIA is telling customers of plans to make chipsets that violate the 4-year-old licensing agreement and that it wants to avoid potential troubles before products are rolled out.
"Rather than wait for something to get in the marketplace and cause customers heartburn and problems, let's get it settled now," Malloy said.
"We hope to get this resolved in a fairly quick fashion."
Intel said that while it is not seeking any damages in court, if it wins it will ask that NVIDIA be ordered to pay its legal costs.

Friday, February 20, 2009

IBM beefs System x with latest Intel, AMD chips

IBM's high-end System x line has been updated with faster x64 processors from both Intel and Advanced Micro Devices, and IBM has also updated the Opteron processors used in its BladeCenter blade servers. The "Dunnington" Xeon 7400 series chips were launched last September, sporting four-core and six-core variants aimed at four-socket and larger servers. While it always takes server makers some time to qualify new chips on their machines, for whatever reason IBM took its time getting the Dunnington chips into its System x boxes.

Considering that Big Blue tends to focus on the high end of the x64 server space, this might have been a contributing factor in the 32 per cent revenue decline IBM posted in the fourth quarter for the combined System x and BladeCenter designs. IBM has moved with a little more spring in its step to get the quad-core "Shanghai" Opterons from AMD into its System x and BladeCenter machines. The Shanghai chips debuted last November and were updated at the end of January with low-voltage Opteron HE parts and a higher-speed Opteron SE part. IBM's high-end x3950 M2 server can now use Intel's fastest Dunnington Xeon MP chip, the six-core X7460 with 9 MB of L2 cache and 16 MB of L3 cache, running at 2.66 GHz.
The x3950 M2 is based on four-socket motherboards. Up to 16 sockets, for a total of 96 cores, can be glued together using IBM's EX4 chipset. IBM will make this faster Xeon MP chip available in the x3950 M2 on March 10. On that same day, the smaller four-socket x3850 M2 server will be available using the quad-core Dunnington E7420 processor running at 2.13 GHz.
This chip has 6 MB of L2 cache and 8 MB of L3 cache. The System x3755 quad-socket server will also be available on March 6 with AMD's standard quad-core Shanghai processors, which run at between 2.4 GHz and 2.7 GHz. The faster Opteron SE and lower-powered Opteron HE parts are not yet available for the x3755, but if you really wanted them, IBM would almost certainly sell them on a special bid basis until it has them formally certified in the boxes.
The quad-core Opteron HE chips in the Shanghai generation are, however, going to be available on March 9 for the company's two-socket LS22 and four-socket LS42 blade servers for its BladeCenter chassis. IBM is shipping the 2.3 GHz Opteron 2376 HE in the LS22 blade and either the 2.2 GHz Opteron 8374 HE or 2.3 GHz Opteron 8376 HE in the LS42 blade. IBM is also shipping 8 GB DDR2 main memory DIMMs for these blades to double up the memory density alongside the doubling up of processing capacity compared to the existing dual-core Opteron HE chips. While it is hard to say why IBM's System x and BladeCenter sales slumped in the third quarter and then plummeted in the fourth quarter, one other factor aside from the lack of Dunnington chips and the impending Shanghai chips was the looming "Nehalem" processors from Intel, which are expected to launch before the end of March.
With the Nehalems offering twice the processor performance and somewhere between three and four times the memory bandwidth of current two-socket Xeon machines, this was probably as big a factor as the economic meltdown. HP's ProLiant server line took a similar hit in the fiscal 2009 first quarter ended January 31, with sales down 22.3 per cent to $2.32bn. IBM swooned further, but HP didn't escape market realities either.

AMD spins The Foundry, no thanks to apathetic investors

Yesterday, AMD shareholders gave final approval to spin the company's debt-dependent semiconductor unit into The Foundry Company, placing it under the wing of Abu Dhabi investors at the Advanced Technology Investment Company.
By all means, it's a radical shift in AMD's business – yet oddly the company's investors don't seem to care much about the decision one way or the other.
Last week, AMD had to push back its special shareholder meeting after it couldn't muster up enough votes to make the decision. Only 42 per cent of AMD's stock was represented at the January 15 meeting. To reach quorum, AMD needs a majority.
This time the vote squeaked by, with a barely acceptable 50.26 per cent of shares represented. Of that bunch, 94 per cent voted in favor of spinning off The Foundry Company.
With the terms of the transaction met, the deal is expected to close on or before March 2, 2009. The Foundry Company will be 34.2 per cent owned by AMD with both companies holding equal voting rights.
The spin-off has been a somewhat bumpy ride for AMD, with Intel antagonism and its shares devalued along the way. So what's with the light attendance? Are AMD investors really so apathetic or clueless about the company that they can't bother to vote yes or no on a decision of this magnitude?

Thursday, February 19, 2009

Intel to chase clouds with custom mobo: More Nehalem goodies

For a little more than a year now, IBM and Dell have been offering custom server designs to hyperscale data center operators that deliver more oomph for a given amount of juice. And it looks like Intel is going to be creating some specialized motherboards for the future "Nehalem" family of Xeon processors so it can get in on the hyperscale action.
As we reported earlier this week, Intel has hitched itself to the cloudwagon and reckons that hyperscale data centers - like those run by Google, Microsoft, Yahoo, Amazon, Facebook, Salesforce, and so on - will account for anywhere from 20 to 25 per cent of its server chip sales by 2012. At Dell, custom servers sold to such customers already account for a significant share of shipments - at least 10 per cent and maybe as high as 20 per cent, by our estimates.
Intel estimates that as much as 14 per cent of its current server chip shipments go into such big data centers. (That may say more about how SMB server sales are drying up than it does about hyperscale sales these days.) As Dell's experience - and the lack of sales into the Google account - shows, such customers want customized products that push the limits of efficiency and performance, and they don't want to pay much of a premium for the innovation either.
Intel wants to help facilitate the hyperscale build-out and the cloud computing revolution and to that end the company is preparing a new energy-efficient motherboard, called "Willowbrook," for its impending "Nehalem" family of processors.
Jason Waxman, general manager of high density computing within Intel's Server Platforms Group and the executive leading the cloud charge, says that a typical two-socket server used in a cloud has a 250-watt power supply and that it draws maybe 200 watts of juice. With the future custom Nehalem motherboard, Intel will use a slightly different layout for components that "unshadows" components so they are not stacked one behind the other, allowing the board to be cooled more efficiently.
This will also increase the efficiency of voltage regulation, so the idle power consumption of a system created using the custom mobo will be under 85 watts. Waxman said that the standard Nehalem boards - and presumably he was talking about two-socket boards based on the "Tylersburg" chipset using the four-core "Nehalem-EP" processors set to launch any day now - burn somewhere between 110 and 115 watts when they are idling.
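
For a rough sense of what those idle numbers mean, here is a back-of-the-envelope calculation using the figures quoted above; the 24/7 idle assumption and the $0.10/kWh electricity price are our own, not Intel's.

    # Back-of-the-envelope idle-power savings implied by the figures quoted above.
    # The 24x7 idle assumption and the $0.10/kWh price are hypothetical.
    standard_idle_watts = 112.5    # midpoint of the quoted 110-115 W range
    willowbrook_idle_watts = 85.0  # "under 85 watts"
    hours_per_year = 24 * 365
    price_per_kwh = 0.10           # assumed, in dollars

    saved_kwh = (standard_idle_watts - willowbrook_idle_watts) * hours_per_year / 1000.0
    print(f"~{saved_kwh:.0f} kWh saved per server per year "
          f"(~${saved_kwh * price_per_kwh:.0f} at $0.10/kWh), before cooling overhead")
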
Waxman did not provide details of the Willowbrook motherboard design, and Intel probably won't say much about it until the Nehalem processor launch. He also did not say if the Willowbrook board was for rack or blade servers. But if Intel is making a custom board for rack servers, it will probably want to do one for blade servers as well.
The unshadowing of components seems simple enough in concept, but proved difficult with the chipset and front side bus architecture of current Xeons. The integrated memory controller on Nehalem chips and the simplified chipset means that components on the board can be moved around so they don't obstruct airflow. At the SC08 supercomputer show back in November, Silicon Graphics was showing off a Nehalem blade board (made by Super Micro, as it turns out).

Manhattan DOM now integrated with IBM technology

MANHATTAN Associates says it has enhanced its Distributed Order Management (DOM) solution, with certified integration with IBM’s technology. DOM is part of the company’s Order Lifecycle Management suite, which gives users a global view of inventory across suppliers, distributors and distribution centres. It now features certified integration with IBM’s WebSphere Commerce, extending the supply network. DOM is also certified for the IBM Retail Integration Framework, using open standards to promote communications between services, information sources and business processes. This allows all users to get a complete view of all available inventory. The solution is part of Manhattan Associates’ SCOPE portfolio, allowing access to other applications such as Warehouse Management and Transportation Planning and Execution.

IBM lobs airplane miles at x64 shops

Server maker IBM has quietly expanded its so-called Power Rewards marketing program, which offers a server-purchase point system akin to airline frequent-flier mile deals. In the past, customers got points if they swapped out RISC server iron in favor of Big Blue's own Power Systems; now, companies using non-IBM x64 iron can trade up to Power Systems and get points, which can be traded in for software and services.

The Power Rewards program came out concurrent with the launch of the Power6-based Power Systems servers back in April 2008. That was when IBM formally and finally converged its System p AIX and Linux boxes with the System i proprietary midrange line, offering AIX, Linux, and i (formerly known as OS/400 and i5/OS) on a single hardware platform. And IBM wanted to use the marketing program to chase rival RISC/Unix boxes.

But since February 10, the company has been offering trade-in points to customers who want to move from x86 and x64 servers made by Hewlett-Packard, Dell, Sun Microsystems, and whitebox vendors.

The original Power Rewards program gave customers swapping out Itanium, Sparc (of various makes), and MIPS processors a certain number of points per processor core that they were swapping out as part of a move to an IBM Power Systems machine. Because IBM was hot to trot to chase the installed base of PA-RISC servers running HP-UX - whose users were facing a recompile as they moved to Hewlett-Packard's Itanium-based Integrity servers - IBM gave customers with PA-RISC servers 4,000 points per core as they traded up to Power.

Sparc, Itanium, Alpha, and MIPS machines were all given 1,000 points per core as a trade-in, which was kind of stingy. Anyway, IBM offered customers a list of 35 pieces of software or services that they could acquire using points; prices for these wares range from 1,000 to 95,000 points. The main one Unix rival shops would care about is a 40-hour block of migration services, which eats up 15,000 points.

Heaven only knows what these services are really worth (you can see the details here), but the implication is that a point is worth a dollar. The total number of points a customer can turn in cannot exceed half the price of the system they acquire as they move off the RISC platform to the IBM iron. So, if you spend $1.5m on a big Power 595 machine, you can only acquire 750,000 points, even if you may be turning in enough PA-RISC or Sparc cores to earn more Power Rewards points than that.
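
As a sketch of how that math works out, here are the per-core rates and the half-the-system-price cap described above expressed in a few lines of Python; the function and the core count in the example are illustrative, not IBM's actual redemption tool.

    # Sketch of the Power Rewards math as described above; the per-core rates and
    # the half-of-system-price cap come from the article, everything else is illustrative.
    POINTS_PER_CORE = {
        "pa-risc": 4000,
        "sparc": 1000,
        "itanium": 1000,
        "alpha": 1000,
        "mips": 1000,
    }

    def power_rewards_points(traded_cores, new_system_price):
        """traded_cores maps architecture name -> number of cores swapped out."""
        earned = sum(POINTS_PER_CORE[arch] * n for arch, n in traded_cores.items())
        cap = new_system_price / 2  # points cannot exceed half the new system's price
        return min(earned, cap)

    # The article's example: a $1.5m Power 595 caps redeemable points at 750,000,
    # even if the traded-in cores would earn more (256 PA-RISC cores -> 1,024,000).
    print(power_rewards_points({"pa-risc": 256}, 1_500_000))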

Starting last week, IBM is offering 500 points per core for x86 and x64 servers made by its rivals. According to Scott Handy, vice president of marketing and strategy for IBM's Power Systems division, the expansion of Power Rewards into the x86 and x64 area is not a matter of the Unix conversion business running out of gas so much as it is IBM trying to get a third bucket of money to match the two it has from HP and Sun Unix conversions.

"The cost of managing server sprawl is higher than the total cost of acquisition benefits customers got in the first place with their x86 servers," Handy contends. And you can tell that he is not in charge of an x64 server lineup at Big Blue. "VMware has kind of helped because they have made it OK to do x86 server consolidation. Now, my sales guys can make a case for Power."

Officially, the Power Rewards consolidation program cannot be used by customers who have IBM's System x and BladeCenter x64-based servers, even if they want to consolidate workloads onto Power iron. But trust me, if a customer wants to do that, I am pretty certain that the reseller or IBM itself can make such a deal stick.

There is one interesting exception where the Power Rewards deal is being applied to IBM x64 iron. If you're a customer consolidating OS/400 and i5/OS workloads onto BladeCenter dual-core JS12 or quad-core JS22 Power6 blade servers in the BladeCenter S small business chassis, you can get 500 points per x86 or x64 processor core for the Intel or AMD iron you swap out. And you can consolidate your workloads onto x64-based blades in the BladeCenter chassis. You don't have to move workloads to the Power blades. IBM is doing this because it has to.

Depending on who you ask at IBM, somewhere between 85 and 90 per cent of OS/400, i5/OS, and i shops have Windows servers doing all kinds of jobs, and the last thing in the world these companies want to do is move those workloads to Linux partitions using Linux applications that are analogous to the Windows applications they have. They want to run their i and Windows workloads on blades, perhaps. But they are not even remotely interested in changing. They have had the ability to run Linux partitions on their iSeries or System i boxes for eight years, and very few customers have switched from Windows to Linux for infrastructure workloads, despite the benefits of consolidation. ®

Tuesday, February 17, 2009

Callisto with 6MB of L3 cache

AMD has another surprise for Intel: it's preparing a CPU codenamed Callisto, a part we haven't seen on the roadmap before. Just as the Heka three-core is nothing more than a Deneb quad-core with one core disabled, AMD wants to do the same with a new dual-core. Callisto is a 45nm dual-core with 1MB of L2 and a massive 6MB of L3 cache. It is an AM3 CPU harvested from the Deneb core, and although it might be quite a big chip, it should give dual-core Pentium CPUs a run for their money. You can expect this CPU in Q2 2009, and it should give Intel's Core 2 dual-core generation plenty to fight against.

IBM's focus on Green continues

IBM says that green IT is growing rapidly among large and small enterprises. IBM expects its green offerings such as the SMDC (Scalable Modular Data Center) and our EMDC (Enterprise Modular Data Center) designs will lead the way for green initiatives across the country.
In a recent study conducted by IBM across eight regions including India, small and medium businesses cited energy costs as one of their biggest cost increases over the past two years and said they are looking at green technologies as an alternative. 'Green computing is looking very bright and is catching up fast among corporations. If you look at the reasons why organizations are going green, you will find that it varies with organizations' goals, like financial gains, operational enhancements or environmental issues,' said Aditya Singhal, Country Manager, Infrastructure Services, IBM India/S Asia.
The latest IDC Green Poll conducted in Asia Pacific stated that 81 percent of organizations thought that the 'greenness' of their IT suppliers would become 'much more important' over the next few years. 'Our focus for 2009 is on designing innovative products and programs to improve energy efficiency, cut costs and counter global warming,' said Singhal.
'Today, 18 percent of organizations consider the greenness of their IT suppliers before making a selection, and another 30 percent expect to do so in the near future. Many clients tell us that going green is simply good business,' he added.
IBM last year deployed Scalable Modular Data Centers for over a dozen of its clients. Some of these clients are Fiat India Automobiles, Bharat Bijlee, Atlas Copco, Apollo Sindhoori, APL, FINO, and Religare.
According to Singhal, IBM's SMDCs have a strong Green component and are primarily targeted at mid-sized enterprises whose business needs demand resiliency but in a cost-effective manner.
'We've ourselves realized the benefits of going Green. We've consolidated 3,000 servers into just 30 System z mainframes. We've consolidated our 155 data centers across the world into just 7 data centers. The new setup consumes 80 percent less energy than earlier, which saves enough energy to power a small town,' added Singhal.

Intel bundles CPUs with SSDs for discount

Intel wants to get ahead in the emerging SSD market and seems to believe it can do so by going after channel partners as opposed to individual customers. Starting with the Core i7, Intel has been offering steep discounts on hardware prices for vendors who pick up both a Core i7 CPU and Intel SSDs. Targeting North America, Europe and China, Intel is offering discounts as high as 15% on the combined price of SSDs and Core i7 CPUs, provided they are ordered together.
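
To put the 15% figure in perspective, here is a trivial sketch of the bundle math; the discount rate is the one reported, but the list prices are made up for illustration.

    # Simple illustration of the bundle discount described above.
    # The 15% rate comes from the article; the list prices are made up.
    cpu_price = 285.0   # hypothetical Core i7 tray price, in dollars
    ssd_price = 390.0   # hypothetical Intel SSD price, in dollars
    discount = 0.15

    bundle_price = (cpu_price + ssd_price) * (1 - discount)
    print(f"Separately: ${cpu_price + ssd_price:.2f}, bundled: ${bundle_price:.2f} "
          f"(saving ${(cpu_price + ssd_price) * discount:.2f})")
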
That's a pretty significant discount, especially when applied to bulk orders. Whether or not vendors will then pass on that discount to their customers remains to be seen, but it'll likely encourage laptop manufacturers to include SSDs more often. Hopefully they'll be able to offer SSD-coupled laptops at a lower premium than they are currently, as it would certainly help the SSD market grow.

Restore intelligence to intel gathering

It is perhaps now common wisdom that throughout the past eight years, the Bush-Cheney administration too often abused the intelligence process, used intelligence as a political tool, disregarded the laws that govern intelligence collection and circumvented the congressional oversight committees, restricting their access to key information.
The level of distrust between the Bush-Cheney administration and Congress over intelligence matters was appalling. The effect on public confidence, and on the morale of our intelligence professionals, was devastating and entirely unnecessary.
But change is coming. At long last, the Obama administration brings the opportunity for our country to chart a new course and restore integrity and credibility to intelligence processes that have been profoundly damaged. As chairman, vice chairman and a member of the Senate Intelligence Committee for eight years, and a U.S. senator for 24 years, I cherish deeply the responsibility bestowed upon me to protect and defend the freedoms of the American people and to keep our nation secure.
If there is a single principle that these past eight years reaffirmed for me, it is that congressional oversight of intelligence activities is absolutely fundamental to fulfilling our duties.
Congress is the only independent reviewer of the legality and effectiveness of secret intelligence activities and the only independent check on programs that are not always in the overall interest of our national security or our democracy.
Yet Congress is wholly dependent on the executive branch for information on the intelligence activities we oversee. And we can act effectively only when all members of the bipartisan House and Senate intelligence committees are informed fully and promptly, as required by law. During the Bush-Cheney era, we saw affirmative efforts to prevent Congress from performing true oversight on matters ranging from warrantless wiretapping to the decision to go to war in Iraq.
One of the most notable abuses was the CIA’s secret detention and interrogation program, which was kept from the full oversight committees for four years, until September 2006 — just a few hours before President George W. Bush publicly disclosed its existence and announced that all the CIA detainees were being transferred to Guantanamo Bay. As with many other programs, the Bush-Cheney administration then disingenuously declared to the public that Congress had been fully informed — entirely distorting the limited, flip-chart briefings given to just eight members of Congress and with strict prohibitions against consulting other members of Congress or the legal and intelligence experts on the committees. In effect, the Bush-Cheney administration twisted and expanded a narrow legal exception for notifications regarding extremely sensitive covert actions into standard operating procedure for any controversial program. The so-called gang of eight notifications, originally intended to protect American lives during ongoing covert operations, became a cynical mechanism to prevent congressional oversight and mislead the public.
Now, outgoing senior Bush-Cheney officials are as eager as ever to tell their same old story. But their record of dodging accountability for their intelligence activities is clear, and it will become even more troubling as we are able to unearth more of the relevant documents and decisions. The Obama administration will face difficult decisions as it moves to undo damaging policies, but it brings to the table a strong commitment to government transparency and inclusion.

As outgoing chairman of the Senate Intelligence Committee, I am proud that we exposed the Bush-Cheney administration’s unforgivable spin of the intelligence regarding Iraq; that we created a safe harbor for the intelligence community to offer honest analysis without regard to the White House’s political aims; that we brought the administration’s warrantless wiretapping program under a system of law that protects Americans’ civil liberties; and that we insisted upon briefings to the full membership of the House and Senate Intelligence committees regarding interrogation and detention practices. These were tough battles, fought — with mixed success — largely behind the scenes and without ever breaching classification rules or threatening our national security.

Moving forward, we need partnership, rather than confrontation. The intelligence community must brief the entire membership of the intelligence committees on all noncovert action intelligence activities. There is no basis in law, and no legitimate rationale, for withholding information about intelligence collection programs authorized and overseen by Congress. Moreover, Congress, through the intelligence committees, must be consulted before new covert action findings are issued. Raising questions and working through them in advance are the only way to produce sound policies boosted by congressional support. And once the president authorizes a covert action, he should inform the entire committee.
The only exceptions should be the very rare instances when the president uses his Gang of Eight authority for a limited period of time to protect an ongoing operation where lives are in immediate danger. As we enter a new period committed to openness and change and bid farewell to an era of obscurity and dishonesty, there is the potential for great progress to be made. The trust between the executive branch and Congress has been breached, and the trust and confidence of the American people have eroded. But I remain confident that, if we restore the vital role of Congress in overseeing our intelligence activities, we can bridge the divide, restore integrity and get back to the business of lawfully and effectively securing this great nation.

Intel, ARM, nVidia Show Off Mobile Chops in Barcelona

Monday is a national holiday in the U.S., but it will be a busy day in Barcelona, Spain, which is hosting the GSMA Mobile World Congress -- one of the largest mobile phone trade conferences.

More than 1,300 firms will be represented at the show, and three chipmakers -- Intel (NASDAQ: INTC), ARM and nVidia (NASDAQ: NVDA) -- will be making announcements of new chip products or extended support.

ARM, best known as the developer of an embedded processor that licensees can modify for their own product needs, will announce it is making the move to 32 nanometers (nm) with a high-k metal gate process.
Both represent advanced technology that ARM can take advantage of because it's a member of IBM's Common Platform alliance, which also includes AMD. ARM partners will have access to the technology in 2009, with full production release in early 2010.

The 32nm design will be used in the Cortex-A9 processor, which can be tied together at a microarchitecture level to yield a multi-core configuration. ARM licensees can, if they want, connect up to four Cortex-A9 cores and make their own quad-core product, depending on where it is used. That could add even greater flexibility to a processor core already known for the large amount of custom logic licensees often add to it. nVidia's Tegra chip, for instance, uses the ARM 11 design along with many of nVidia's own ingredients.

"This new design means we can offer processors that are faster, smaller and consume less power," said James Bruce, manager of North American mobile solutions for ARM. "These process leaps really do help us and help the phone industry deliver features that weren't possible a generation or two before."

Cortex licensees get the Physical IP prototype libraries as well as test structures for validating the technology. The core IP includes logic, memory and interface products.


Thursday, February 12, 2009

Al-Qaeda 'less capable and effective': US intel chief

Al-Qaeda is "less capable and effective" than it was a year ago after a series of damaging blows that have killed key leaders in Pakistan's tribal areas, the new US intelligence chief said.
Nevertheless, retired admiral Dennis Blair, the director of national intelligence, said Al-Qaeda was still planning attacks on the West and is believed to view Europe as a "viable launching point."
In an annual threat assessment to Congress, Blair also highlighted growing Al-Qaeda threats in Yemen, East Africa and North Africa.
But in an allusion to US missile strikes this year in Pakistan's tribal areas, he said: "Al-Qaeda lost significant parts of its command structure since 2008 in a succession of blows as damaging to the group as any since the fall of the Taliban in late 2001.
"Because of the pressure we and our allies have put on Al-Qaeda's core leadership in Pakistan and the continued decline of Al-Qaeda's most prominent regional affiliate in Iraq, Al-Qaeda today is less capable and effective than it was a year ago," he said.
He said the loss of key leaders in quick succession "has made it more difficult for Al-Qaeda to identify replacements, and in some cases the group has had to promote more junior figures considerably less skilled and respected than the individuals they are replacing."
The US missile strikes have aroused vehement opposition in Pakistan amid charges of civilian deaths and warnings by the government that the US campaign is "counter-productive."
And the report gave no indication that the cross-border campaign was helping to curb violence in either Afghanistan or Pakistan.

On the contrary, it said the Taliban-dominated insurgency in Afghanistan has expanded in geographic scope and aggressiveness, moving into previously peaceful areas around Kabul and challenging the central government.

In Pakistan, the government was "losing authority in parts of the North-West Frontier Province and has less control of its semi-autonomous tribal areas."
Blair's report, however, said that sustained pressure on Al-Qaeda in the tribal areas "has the potential to further degrade its organizational cohesion and diminish the threat it poses."
If forced to leave the safe havens, Al-Qaeda would have to adopt a more dispersed, clandestine structure that would make training and operational coordination more difficult, it said.
"It is conceivable Al-Qaeda could relocate elsewhere in South Asia, the Gulf, or parts of Africa where it could exploit a weak central government and close proximity to established recruitment, fundraising, and facilitation networks," the report said.

"But we judge none of these locations would be as conducive to their operational needs as their location in the FATA (federally administered tribal areas)," it said.
At the same time, however, Blair noted Al-Qaeda threats arising in other areas.
"Yemen is reemerging as a jihadist battleground and potential regional base of operations for Al-Qaeda to plan internal and external attacks, train terrorists, and facilitate the movement of operatives," his assessment said.
Saudi Arabia, which has dealt harshly with the group since 2003, "is now facing new external threats from Al-Qaeda elements in the region, particularly from Yemen," he said.
Despite the death or capture of most Saudi-based Al-Qaeda leaders or operatives over the past five years, senior Al-Qaeda leaders view the oil-rich kingdom as a strategic target and they remain intent on resurrecting an operational presence there.
The report also noted the threat posed by Al-Qaeda and other Sunni affiliates returning to Europe from training in Pakistan.

"We have had limited visibility into European plotting, but we assess that Al-Qaeda is continuing to plan attacks in Europe and the West," it said.
"Al-Qaeda has used Europe as a launching point for external operations against the homeland on several occasions since 9/11, and we believe the group continues to view Europe as a viable launching point," it said.
It said Denmark and Britain remain "viable targets" of Al-Qaeda attacks, and France has been prominently mentioned by Al-Qaeda leaders, making it a possible target in reprisal for a 2004 ban on headscarves.

"Increased security measures at home and abroad have caused Al-Qaeda to view the West, especially the United States, as a harder target than in the past, but we remain concerned about an influx of Western recruits into the tribal areas since mid-2006," it said.

Intel Sets Example for Small Businesses

Good economic news is rare these days. So when a major U.S. company promises to create thousands of high-paying jobs, even the president gives thanks.
That's what happened to Intel (INTC) this week. The company said it will spend $7 billion over the next two years to build high-tech chip-manufacturing plants, prompting a "way-to-go" phone call from President Obama to Intel CEO Paul Otellini. Like other Fortune 500 companies, Intel has seen demand for its products shrink in recent months. But it's still taking a leap of faith and planning for the future. What about your company? Times are tough and money is tight. But if you focus too much on day-to-day survival instead of long-term goals, you risk falling behind.

Otellini clearly wanted to make a national splash with his announcement, which is why he made it at the Economic Club of Washington, D.C., rather than at Intel's California headquarters. The jobs, described as high-wage and high-skill, will be filled at existing manufacturing sites in Oregon, Arizona and New Mexico.

"I am pleased to announce our intention to stamp the words 'made in America' on even more Intel products in the months and years to come," Otellini said.
With all the talk of economic-stimulus plans, U.S. consumers are becoming more receptive to "Buy American" pitches. And Intel is eager to make its case. Although Intel generates more than 75% of its sales overseas, the company spends most of its manufacturing and research-and-development budget in the U.S.

Windows 7: What People Will Like and What They Won't

As I've been using Windows 7 over the last couple of months, I've become convinced that most users will find a lot to like in the new operating system. My guess is that it will get a much warmer reception than Windows Vista did. But there are a lot of changes--more than you might think--and I'm sure there will be people who won't be happy with all of them.
But let's start with the things I think everyone will like. It's faster. At least in Beta 1, the system seems to boot faster on my machine. In benchmarks, I don't see any significant performance difference running applications on Windows 7 compared with Vista SP1, but in practice it just feels a bit more responsive.
Now, every OS seems to degrade in performance over time, as you add more updates and applications, so I can't be sure this will be the case in the production version, but for now I'm quite happy.

It takes fewer resources. Windows 7 is lighter than Windows Vista and will even run on netbooks, machines where Vista is a non-starter. That's a win for everyone: users, PC makers, and Microsoft.

It's less annoying: By default, the user account control feature no longer asks your permission for changes you make to the system yourself, only for changes that other programs are making. That means it pops up a whole lot less often, and as a result, my guess is people will pay more attention to it. That's good. Some people are worried that this means that if you install a rogue program on your system, it could do more damage, but that may be an unavoidable risk--once a rogue program is installed, you're in trouble anyway.

The new taskbar is very useful: The Windows 7 taskbar works quite differently from the one in Windows Vista or Windows XP. Now you can "pin" programs to the taskbar, creating a row of programs (as on the current Macintosh) that will either switch to open windows in that program or launch the program if it is closed.

Hovering the mouse over the icon shows active thumbnails of what the open windows look like, and right-clicking brings up a list of the most recent items you've opened in the applications, including Web sites in IE and frequent locations in Windows Explorer. A tiny space in the far right-hand corner hides all the windows and makes the desktop visible.

This sounds like a small change, but in practice, it's a really nice enhancement. It may take a little bit of getting used to, but I suspect nearly all Windows users will get used to this and like it. If you don't like it, you'll have options for changing it back to the way it worked under Vista. ExtremeTech has a good preview here with all the details.

It has an improved backup program: I've generally been happier with the programs that come bundled with most external hard drives than with the one that came with earlier versions of Windows, but Windows 7 has a perfectly nice, easy-to-use backup program designed to back up its settings and your files to an external hard drive on a simple schedule. It's not continuous backup, but it's much better than what was included in previous versions. I'm still amazed that Apple was able to sell Leopard mostly on the basis of its Time Machine backup, and I'm constantly reminding people to back up, so this can only be a good thing.

There are other things that offer potential but will need support from third parties to be useful. One of these is Device Stage: This gives you a photo of the printer, camera, or other device that can appear in the taskbar and gives you direct access to what the device can do. It sounds cool, and if it really works, I expect people will like it, but it will need a lot more device support to be useful.
It won't be completely compatible. No new operating system ever is. Even though Windows 7 appears to be quite similar to Windows Vista in terms of compatibility, and has a number of nice features to make compatibility easier, you know it won't run everything. And that will always be a problem, which is why I bet companies want more time working with it before deciding on an upgrade path. (I'll have more on compatibility tomorrow).

It's just different. Again, people are used to Windows XP or even Vista. And again, Microsoft has moved some things around and that will take some getting used to.
Overall, my guess is that most people are going to be very happy with Windows 7. It solves most of the issues we've had with Vista, and it's faster, leaner, and just better looking. But I'm sure there will be people who won't like it--indeed, who will just prefer XP because it's what they are familiar with. Nothing Microsoft could do would please everyone.

Wednesday, February 11, 2009

XHTML

Stands for "Extensible Hypertext Markup Language." Yes, apparently "Extensible" starts with an "X." XHTML is a spinoff of the hypertext markup language (HTML) used for creating Web pages. It is based on the HTML 4.0 syntax, but has been modified to follow the guidelines of XML, the Extensible Markup Language. Therefore, XHTML 1.0 is sometimes referred to as HTML 5.0.

Because XHTML is "extensible," Web developers can create their own objects and tags for each Web page they build. This gives developers more control over the appearance and organization of their Web pages. The only requirement is that the custom tags and attributes are defined in a document type definition (DTD) that is referenced by the XHTML page.

XHTML pages must also conform to a stricter syntax than regular HTML pages. While Web browsers are rather lenient and forgiving of HTML syntax, XHTML pages must have perfect syntax. This means no missing quotes or incorrect capitalization in the markup language. While the strict syntax requires more meticulous Web page creation, it also ensures Web pages will appear more uniform across different browser platforms.
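
To see what "perfect syntax" means in practice, here is a small sketch that runs two fragments through Python's standard XML parser; it only checks well-formedness (the strictness XHTML inherits from XML), not validity against the XHTML DTD.

    # Rough illustration of XHTML's strictness using Python's XML parser.
    # This checks well-formedness only, not validity against the XHTML DTD.
    import xml.etree.ElementTree as ET

    sloppy_html = '<p>An image: <img src=photo.jpg></p>'            # unquoted attribute, unclosed <img>
    strict_xhtml = '<p>An image: <img src="photo.jpg" alt="" /></p>'

    for label, markup in [("sloppy", sloppy_html), ("strict", strict_xhtml)]:
        try:
            ET.fromstring(markup)
            print(f"{label}: parses cleanly as XML")
        except ET.ParseError as err:
            print(f"{label}: rejected ({err})")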

Virtual Memory

Memory is hardware that your computer uses to load the operating system and run programs. It consists of one or more RAM chips that each have several memory modules. The amount of real memory in a computer is limited to the amount of RAM installed. Common memory sizes are 256MB, 512MB, and 1GB.

Because your computer has a finite amount of RAM, it is possible to run out of memory when too many programs are running at one time. This is where virtual memory comes in. Virtual memory increases the available memory your computer has by enlarging the "address space," or places in memory where data can be stored. It does this by using hard disk space for additional memory allocation. However, since the hard drive is much slower than the RAM, data stored in virtual memory must be mapped back to real memory in order to be used.
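
As a loose analogy only (real virtual memory is handled transparently by the operating system and the processor), here is a short Python sketch that uses mmap to back a block of "memory" with a file on disk, which is the same basic idea of spilling data to the hard drive.

    # Loose analogy only: real virtual memory is managed by the OS and CPU,
    # but mmap shows the idea of backing "memory" with space on disk.
    import mmap
    import os

    PAGE = 4096            # a common page size, in bytes
    path = "swapfile.bin"  # hypothetical file standing in for swap space

    with open(path, "wb") as f:
        f.write(b"\x00" * PAGE * 4)       # reserve four "pages" on disk

    with open(path, "r+b") as f:
        mem = mmap.mmap(f.fileno(), 0)    # map the file into the address space
        mem[0:5] = b"hello"               # writes go to disk-backed memory
        print(mem[0:5])                   # the data lives on disk, not in RAM alone
        mem.close()

    os.remove(path)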

The process of mapping data back and forth between the hard drive and the RAM takes longer than accessing it directly from the memory. This means that the more virtual memory is used, the more it will slow your computer down. While virtual memory enables your computer to run more programs than it could otherwise, it is best to have as much physical memory as possible. This allows your computer to run most programs directly from the RAM, avoiding the need to use virtual memory. Having more RAM means your computer works less, making it a faster, happier machine.

Raster Graphic

Most images you see on your computer screen are raster graphics. Pictures found on the Web and photos you import from your digital camera are raster graphics. They are made up of a grid of pixels, commonly referred to as a bitmap. The larger the image, the more disk space the image file will take up. For example, a 640 x 480 image requires information to be stored for 307,200 pixels, while a 3072 x 2048 image (from a 6.3 Megapixel digital camera) needs to store information for a whopping 6,291,456 pixels.
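
Here is that arithmetic spelled out in a few lines of Python, with uncompressed sizes added under the assumption of 24-bit color (3 bytes per pixel), a detail not stated above.

    # The pixel counts from the text, plus uncompressed sizes assuming
    # 24-bit color (3 bytes per pixel) - the byte depth is an assumption.
    def raster_size(width, height, bytes_per_pixel=3):
        pixels = width * height
        return pixels, pixels * bytes_per_pixel

    for w, h in [(640, 480), (3072, 2048)]:
        pixels, size = raster_size(w, h)
        print(f"{w} x {h}: {pixels:,} pixels, ~{size / 1024 / 1024:.1f} MB uncompressed")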

Since raster graphics need to store so much information, large bitmaps require large file sizes. Fortunately, there are several image compression algorithms that have been developed to help reduce these file sizes. JPEG and GIF are the most common compressed image formats on the Web, but several other types of image compression are available.
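
As a tiny illustration of lossless compression, the sketch below squeezes a synthetic solid-color bitmap with zlib (the DEFLATE algorithm used by PNG, not JPEG or GIF specifically); the flat image is deliberately easy to compress, and real photos shrink far less.

    # Lossless-compression illustration with zlib (DEFLATE, as used by PNG).
    # The solid-color synthetic "image" is made up; real photos compress far less.
    import zlib

    flat_image = bytes([200, 200, 200]) * (640 * 480)  # one solid color, 3 bytes per pixel
    compressed = zlib.compress(flat_image)

    print(f"raw: {len(flat_image):,} bytes, compressed: {len(compressed):,} bytes "
          f"({len(compressed) / len(flat_image):.1%} of original)")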

Raster graphics can typically be scaled down with no loss of quality, but enlarging a bitmap image causes it to look blocky and "pixelated." For this reason, vector graphics are often used for certain images, such as company logos, which need to be scaled to different sizes.
File extensions: .BMP, .TIF, .GIF, .JPG

Password

A password is a string of characters used for authenticating a user on a computer system. For example, you may have an account on your computer that requires you to log in. In order to successfully access your account, you must provide a valid username and password. This combination is often referred to as a login. While usernames are generally public information, passwords are private to each user.

Most passwords are made up of several characters, which can typically include letters, numbers, and most symbols, but not spaces. While it is good to choose a password that is easy to remember, you should not make it so simple that others can guess it. The most secure passwords use a combination of letters and numbers and do not contain actual words.
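As a rough sketch of those guidelines, the little Python function below checks length, requires a mix of letters and digits, forbids spaces, and rejects a few obvious dictionary words. The rules and the tiny word list are purely illustrative, not a real security policy:

    # Illustrative password check, not a real security policy.
    COMMON_WORDS = {"password", "letmein", "qwerty"}   # toy word list for the example

    def looks_reasonable(pw):
        return (len(pw) >= 8
                and " " not in pw
                and any(c.isalpha() for c in pw)
                and any(c.isdigit() for c in pw)
                and pw.lower() not in COMMON_WORDS)

    for candidate in ("password", "blue42sky9", "short1"):
        print(candidate, "->", "OK" if looks_reasonable(candidate) else "too weak")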

Laptop

Laptop computers, also known as notebooks, are portable computers that you can take with you and use in different environments. They include a screen, keyboard, and a trackpad or trackball, which serves as the mouse. Because laptops are meant to be used on the go, they have a battery which allows them to operate without being plugged into a power outlet. Laptops also include a power adapter that allows them to use power from an outlet and recharges the battery.

While portable computers used to be significantly slower and less capable than desktop computers, advances in manufacturing technology have enabled laptops to perform nearly as well as their desktop counterparts. In fact, high-end laptops often perform better than low or even mid-range desktop systems. Most laptops also include several I/O ports, such as USB ports, that allow standard keyboards and mice to be used with the laptop. Modern laptops often include a wireless networking adapter as well, allowing users to access the Internet without requiring any wires.

While laptops can be powerful and convenient, the convenience often comes at a price. Most laptops cost several hundred dollars more than a similarly equipped desktop model with a monitor, keyboard, and mouse. Furthermore, working long hours on a laptop with a small screen and keyboard may be more fatiguing than working on a desktop system. Therefore, if portability is not a requirement for your computer, you may find better value in a desktop model.

Hard Drive

The hard drive is what stores all your data. It houses the hard disk, where all your files and folders are physically located. A typical hard drive is only slightly larger than your hand, yet can hold over 100 GB of data. The data is stored on a stack of disks that are mounted inside a solid encasement. These disks spin extremely fast (typically at either 5400 or 7200 RPM) so that data can be accessed immediately from anywhere on the drive.

The data is stored on the hard drive magnetically, so it stays on the drive even after the power supply is turned off. The term "hard drive" is actually short for "hard disk drive." The term "hard disk" refers to the actual disks inside the drive. However, all three of these terms are usually seen as referring to the same thing -- the place where your data is stored. Since "hard drive" is the most common of the three, that is the term used here.

Hardware

Computer hardware refers to the physical parts of a computer and related devices. Internal hardware devices include motherboards, hard drives, and RAM. External hardware devices include monitors, keyboards, mice, printers, and scanners.

The internal hardware parts of a computer are often referred to as components, while external hardware devices are usually called peripherals. Together, they all fall under the category of computer hardware. Software, on the other hand, consists of the programs and applications that run on computers. Because software runs on computer hardware, software programs often have system requirements that list the minimum hardware required for the software to run.

Hard Disk

When you save data or install programs on your computer, the information is typically written to your hard disk. The hard disk is a spindle of magnetic disks, called platters, that record and store information. Because the data is stored magnetically, information recorded to the hard disk remains intact after you turn your computer off. This is an important distinction between the hard disk and RAM, or memory, which is reset when the computer's power is turned off.

The hard disk is housed inside the hard drive, which reads and writes data to the disk. The hard drive also transmits data back and forth between the CPU and the disk. When you save data on your hard disk, the hard drive has to write thousands, if not millions, of ones and zeros to the hard disk. It is an amazing process to think about, but may also be a good incentive to keep a backup of your data.

Facebook

Facebook is a social networking website that was originally designed for college students, but is now open to anyone 13 years of age or older. Facebook users can create and customize their own profiles with photos, videos, and information about themselves. Friends can browse the profiles of other friends and write messages on their pages.

Each Facebook profile has a "wall," where friends can post comments. Since the wall is viewable by all the user's friends, wall postings are basically a public conversation. Therefore, it is usually best not to write personal messages on your friends' walls. Instead, you can send a person a private message, which will show up in his or her private Inbox, similar to an e-mail message.
Facebook allows each user to set privacy settings, which by default are pretty strict. For example, if you have not added a certain person as a friend, that person will not be able to view your profile. However, you can adjust the privacy settings to allow users within your network (such as your college or the area you live in) to view part or all of your profile. You can also create a "limited profile," which allows you to hide certain parts of your profile from a list of users that you select. If you don't want certain friends to be able to view your full profile, you can add them to your "limited profile" list.

Another feature of Facebook, which makes it different from MySpace, is the ability to add applications to your profile. Facebook applications are small programs developed specifically for Facebook profiles. Some examples include SuperPoke (which extends Facebook's "poke" function) and FunWall (which builds on the basic "wall" feature). Other applications are informational, such as news feeds and weather forecasts. There are also hundreds of video game applications that allow users to play small video games, such as Jetman or Tetris, within their profiles. Since most game applications save high scores, friends can compete against each other or against millions of other Facebook users.

Facebook provides an easy way for friends to keep in touch and for individuals to have a presence on the Web without needing to build a website. Since Facebook makes it easy to upload pictures and videos, nearly anyone can publish a multimedia profile. Of course, if you are a Facebook member or decide to sign up one day, remember to use discretion in what you publish or what you post on other users' pages. After all, your information is only as public as you choose to make it!

Excel

Microsoft Excel is a spreadsheet program for Windows and Macintosh computers. It is part of the Microsoft Office suite, which includes other productivity programs, such as Word and PowerPoint.
Though Excel is developed by Microsoft, the first version of the program was released for the Macintosh in 1985. It wasn't until 1987 that Excel was made available for Windows. Since then, Microsoft has supported the program on both platforms, releasing updates about every two years.

Some other popular spreadsheet programs include IBM Lotus 1-2-3 (for Windows) and the AppleWorks spreadsheet program (for the Mac). However, Microsoft Excel has led the spreadsheet market for many years and continues to be the most popular spreadsheet program for both businesses and consumers.

E-Mail

It's hard to remember what our lives were like without e-mail. Ranking up there with the Web as one of the most useful features of the Internet, e-mail has become one of today's standard means of communication. Billions of messages are sent each year. If you're like most people these days, you probably have more than one e-mail address. After all, the more addresses you have, the more sophisticated you look.

E-mail is part of the standard TCP/IP set of protocols. Sending messages is typically done by SMTP (Simple Mail Transfer Protocol) and receiving messages is handled by POP3 (Post Office Protocol 3) or IMAP (Internet Message Access Protocol).

IMAP is the newer protocol, allowing you to view and sort messages on the mail server without downloading them to your hard drive.

Though e-mail was originally developed for sending simple text messages, it has become more robust in the last few years. Now, HTML-based e-mail can use the same code as Web pages to incorporate formatted text, colors, and images into the message. Also, documents can be attached to e-mail messages, allowing files to be transferred via the e-mail protocol. However, since e-mail was not originally designed to handle large file transfers, transferring large documents (over 3 MB, for example) is not allowed by most mail servers. So remember to keep your attachments small!
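To give a feel for the sending side of that process, here is a minimal Python sketch built on the standard smtplib module. The mail server name, port, and addresses are placeholders, not a working configuration:

    # Minimal SMTP sketch using Python's standard library.
    # The server, port, and addresses below are placeholders, not real accounts.
    import smtplib
    from email.message import EmailMessage

    msg = EmailMessage()
    msg["From"] = "alice@example.com"
    msg["To"] = "bob@example.com"
    msg["Subject"] = "Hello"
    msg.set_content("Just a plain-text test message.")

    with smtplib.SMTP("mail.example.com", 25) as server:   # connect to the outgoing mail server
        server.send_message(msg)                            # hand the message off via SMTP

Receiving the message on the other end would then be the job of a POP3 or IMAP client.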

External Hard Drive

Nearly all personal computers come with an internal hard drive. This drive stores the computer's operating system, programs, and other files. For most users, the internal hard drive provides enough disk space to store all the programs and files. However, if the internal hard drive becomes full or if the user wants to back up the data on the internal hard drive, an external hard drive may be useful.

External hard drives typically have one of two interfaces: USB or FireWire. USB hard drives commonly use the USB 2.0 interface because it supports data transfer rates of up to 480 Mbps. USB 1.1 only supports transfers of up to 12 Mbps, which would make the hard drive seem slow to even the most patient people. FireWire drives may use either FireWire 400 or FireWire 800, which support data transfer rates of up to 400 and 800 Mbps respectively.
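To put those interface speeds in perspective, here is a quick best-case calculation of how long a 700 MB file (roughly a CD's worth of data, chosen just as an example) would take over each interface; real-world throughput is always somewhat lower than the rated maximum:

    # Best-case time to move a 700 MB file over each interface (real throughput is lower).
    FILE_MB = 700
    for name, mbps in [("USB 1.1", 12), ("USB 2.0", 480), ("FireWire 400", 400), ("FireWire 800", 800)]:
        seconds = FILE_MB * 8 / mbps          # megabits in the file / megabits per second
        print(f"{name:13s}: about {seconds:6.1f} seconds")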

The most likely users to need external hard drives are those who do audio and video editing. This is because high-quality media files can fill up even the largest hard drives. Fortunately, external hard drives can be daisy chained, which means they can be connected one after the other and be used at the same time. This allows for virtually unlimited amounts of storage.
Users who do not require extra storage may still find external hard drives useful for backing up their main hard drive. External hard drives are a great backup solution because they can store an exact copy of another hard drive and can be stored in a safe location. Using the drive to restore data or perform another backup is as simple as connecting it to the computer and dragging the necessary files from one drive to another.

While most external hard drives come in heavy, protective cases, some hard drives are designed primarily for portability. These drives usually don't hold as much data as their larger desktop counterparts, but they have a sleek form factor and can easily be transported with a laptop computer. Some portable drives also include security features such as fingerprint recognition that prevent other people from accessing data on the drive in case it is lost.

Desktop

Your computer's desktop is much like a physical desktop. You probably keep a number of commonly used items on your desk such as pens, papers, folders, and other items. Your computer's desktop serves the same purpose -- to give you easy access to items on your hard drive. It is common to store frequently used files, folders, and programs on your desktop.

This allows you to access the items quickly instead of digging through the directories on your hard drive each time you want to open them.

Both the Macintosh and Windows interfaces use the desktop as a central part of the interface. Both operating systems allow you to move items on and off the desktop as you wish and offer organization tools to arrange and clean up the items on the desktop. Yes, it would be nice if there was an option like that for a real-life desktop. You can also customize your computer's desktop with the pattern or background image of your choice.

Card Reader

Card reader" is the generic term for an input device that reads flash memory cards. It can be a standalone device that connects to a computer via USB or it may be integrated into a computer, printer, or multifunction device. In fact, most multifunction printer/scanner/copiers now have built-in card readers.

Most card readers accept multiple memory card formats, including compact flash (CF), secure digital (SD), and Sony's Memory Stick. Some card readers accept various other formats such as XD, SmartMedia, Microdrive, and Memory Stick Pro Duo cards.

The purpose of a card reader is, not surprisingly, to read the data from a memory card. When you place a memory card into a card reader, it will often show up on your computer as a mounted disk. You can then view the contents of the memory card by double-clicking the card's icon. This icon typically appears on the desktop of Macintosh computers or inside "My Computer" on Windows machines.

Since memory cards most often contain pictures from digital cameras, a photo organization program may automatically open when you insert a memory card into your card reader. This provides an easy way of importing your pictures into your photo album. If you don't want to import photos using the program, you can simply close the program and the card will still be mounted on your computer.

Once you decide to remove the card, make sure you unmount or "eject" the disk before physically removing the card. This will help prevent the data on the card from becoming corrupted.

Database

This is a data structure used to store organized information. A database is typically made up of many linked tables of rows and columns. For example, a company might use a database to store information about their products, their employees, and financial information. Databases are now also used in nearly all e-commerce sites to store product inventory and customer information.

Database software, such as Microsoft Access, FileMaker Pro, and MySQL, is designed to help companies and individuals organize large amounts of information in a way where the data can be easily searched, sorted, and updated.

While the first databases were relatively "flat" (limited to simple rows and columns), today's relational databases allow users to access, update, and search information based on the relationship of data in one table to another. Certain databases even let users store data such as sound clips, pictures, and videos.
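As a small, concrete example of what "rows and columns plus easy searching" looks like in practice, the sketch below uses Python's built-in sqlite3 module; the table and the data in it are made up for illustration:

    # A tiny relational table, queried with SQL, using Python's built-in sqlite3 module.
    import sqlite3

    db = sqlite3.connect(":memory:")                      # throwaway in-memory database
    db.execute("CREATE TABLE products (name TEXT, price REAL, in_stock INTEGER)")
    db.executemany("INSERT INTO products VALUES (?, ?, ?)",
                   [("Keyboard", 29.99, 1), ("Monitor", 199.00, 0), ("Mouse", 14.99, 1)])

    # Search and sort the organized data, just as the definition describes.
    for name, price in db.execute(
            "SELECT name, price FROM products WHERE in_stock = 1 ORDER BY price"):
        print(f"{name}: ${price:.2f}")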

Cable Modem

A cable modem is used for connecting to the Internet and is much faster than a typical dial-up modem. While a 56K modem can receive data at about 53 Kbps, cable modems support data transfer rates of up to 30 Mbps. That's over 500 times faster. However, most ISPs limit their subscribers' transfer rates to less than 6 Mbps to conserve bandwidth.

Another important way that a cable modem is different than a dial-up modem is that it doesn't connect to a phone line. Instead, the cable modem connects to a local cable TV line, hence the term "cable modem." This allows cable modems to have a continuous connection to the Internet. Therefore, there is no need to dial your ISP every time you want to check your e-mail.
Cable modems, which have a much more complex design than dial-up modems, are usually external devices, but some models can be integrated within a computer. Instead of connecting to a serial port like an external dial-up modem, cable modems attach to a standard Ethernet port so they can transfer data at the fastest speed possible.

Batch Process

As most computer users know, some computing tasks can be tedious and repetitive. Fortunately, if a task is indeed repetitive, a batch process can be used to automate much of the work.
A batch process performs a list of commands in sequence. It can be run by a computer's operating system using a script or batch file, or may be executed within a program using a macro or internal scripting tool. For example, an accountant may create a script to open several financial programs at once, saving him the hassle of opening each program individually.
This type of batch process would be executed by the operating system, such as Windows or the Mac OS. A Photoshop user, on the other hand, might use a batch process to modify several images at one time. For example, she might record an action within Photoshop that resizes and crops an image. Once the action has been recorded, she can batch process a folder of images, which will perform the action on all the images in the folder.
Batch processing can save time and energy by automating repetitive tasks. While it may take awhile to write the script or record the repetitive actions, doing it once is certainly better than having to do it many times.
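In the same spirit as the Photoshop example above, here is a rough Python sketch of a batch process that resizes every JPEG in a folder. It assumes the third-party Pillow imaging library is installed and that a folder named "photos" exists; both are just stand-ins for whatever your own repetitive task happens to be:

    # Batch-resize every JPEG in a folder (assumes the Pillow library: pip install pillow).
    from pathlib import Path
    from PIL import Image

    for path in sorted(Path("photos").glob("*.jpg")):    # run the same steps on each file
        with Image.open(path) as img:
            img.thumbnail((800, 800))                    # shrink so the longest side is 800 px
            img.save(Path("photos") / f"small_{path.name}")
        print("processed", path.name)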

Access

Microsoft Access, often abbreviated "MS Access," is a popular database application for Windows. Access allows users to create custom databases that store information in an organized structure. The program also provides a visual interface for creating custom forms, tables, and SQL queries. Data can be entered into an Access database using either visual forms or a basic spreadsheet interface. The information stored within an Access database can be browsed, searched, and accessed from other programs, including Web services.

While Access is a proprietary database management system (DBMS), it is compatible with other database programs since it supports Open Database Connectivity (ODBC). This allows data to be sent to and from other database programs, such as MS SQL, FoxPro, FileMaker Pro, and Oracle databases. This compatibility also enables Access to serve as the back end for a database-driven website. In fact, Microsoft FrontPage and Expression Web, as well as ASP.NET, have built-in support for Access databases. For this reason, websites hosted on Microsoft Windows servers often use Access databases for generating dynamic content.
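As a hedged illustration of that ODBC compatibility, a Python script might read an Access database along the lines of the sketch below. It assumes the third-party pyodbc package, a Windows machine with the Access ODBC driver installed, and a purely hypothetical inventory.accdb file containing a products table:

    # Reading an Access database over ODBC (assumes pyodbc and the Access ODBC driver).
    import pyodbc

    conn = pyodbc.connect(
        r"DRIVER={Microsoft Access Driver (*.mdb, *.accdb)};"
        r"DBQ=C:\data\inventory.accdb"                    # hypothetical database file
    )
    for row in conn.execute("SELECT name, price FROM products"):
        print(row.name, row.price)
    conn.close()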

JAVA

While most of the world uses "Java" as another term for coffee, the computer science world uses it to refer to a programming language developed by Sun Microsystems. The syntax of Java is much like that of C/C++, but it is object-oriented and structured around "classes" instead of functions. Java can also be used for programming applets -- small programs that can be embedded in Web sites.

The language is becoming increasingly popular among both Web and software developers since it is efficient and easy to use.

Java is a great programming language, but like Reading Rainbow says, you don't have to take my word for it. Sun Microsystems describes Java as a "simple, object-oriented, distributed, interpreted, robust, secure, architecture-neutral, portable, high-performance, multithreaded, dynamic, buzzword-compliant, general-purpose programming language." And it removes stains like magic.

Sunday, February 1, 2009

A Hundred Years of Invention - The First Computer

There's long been controversy in the computing world over what was the first computer invented.
For years, the accepted pioneer of the digital age was the ENIAC, short for Electronic Numerical Integrator And Computer, perhaps because the story associated with its development was one worthy of tabloids and television.

As World War II was coming to a close, the Army had run short of mathematicians and was willing to recruit women. Six women were accepted to work on "Project PX" at the University of Pennsylvania's Moore School of Electrical Engineering, under John Mauchly and J. Presper Eckert. The women's job was to program firing tables and ballistic trajectories using ENIAC.

Their work laid the groundwork for programming. The completed machine was unveiled on Feb. 14, 1946 at the University of Pennsylvania. The military had funded its cost of almost $500,000. It occupied about 1,800 square feet, used about 18,000 vacuum tubes, and weighed almost 50 tons. It is widely considered to be the first computer invented, given that it remained in service through the late 1950s.

However, its "first" status was challenged in court when Rand Corp. bought the ENIAC patent and started charging royalties. Honeywell Inc. refused to pay and challenged the patent in 1967. It was learned that Mauchly, one of the leaders of the Project PX at the University of Pennsylvania, had seen an early prototype of a device being built at the Iowa State College called the Atanasoff-Berry Computer.

Professor John Vincent Atanasoff and graduate student Cliff Berry began development of the ABC in 1937, and it continued to be developed until 1942 at Iowa State College (now Iowa State University). Eventually, it could solve equations containing 29 variables.
In 1973, U.S. Federal Judge Earl R. Larson released his decision that the ENIAC patent by Mauchly and Eckert was invalid and the ABC was actually the first computer invented. However, the ABC was never fully functional, so the popular opinion to this day has the ENIAC as the first electronic computing device.

The Smithsonian Institution's Museum of American History in Washington displays most of what remains of the ENIAC, alongside bits of the ABC.
However, there's another twist to this tale. The most basic computer is an electronic device designed to accept data, perform prescribed mathematical and logical operations and display the results. Germany's Konrad Zuse created what was essentially the first programmable calculator in the mid-1930s in his parents' living room. Zuse's Z1 had a 64-word memory and a clock speed of 1 Hz. Programming the Z1 required the user to insert tape into a punch tape reader and then receive his results through a punch tape dispenser, making it possibly the first computer invented.

Invention IDs Computer Users By Typing Patterns

The graduate student, Joey Rogers, built his master’s thesis around the invention, and Brown got the satisfaction and excitement that go along with being the first person to discover something. The pair had, however, gotten little else.

“This patent had earned me two free lunches,” Brown quipped recently while sitting in his Houser Hall office within UA’s College of Engineering. “And it probably helped me with tenure.”
The payoff just got a bit more tangible.

Brown and Rogers each recently received checks for approximately $15,700, as their share of the proceeds from the sale of the patent. “The idea that it was something that would pay us was very much unexpected,” Brown said.
Dr. Keith McDowell, vice president for research at UA, said one of the research office’s goals is to raise such faculty expectations, enabling campus researchers to see that intellectual property (new knowledge with commercial applications) created can have multiple payoffs, including financial ones. “Through our technology transfer office (started in October 2004 and directed by Dr. Dan Daly), we are aggressively marketing intellectual property developed by our faculty,” McDowell said. “This can serve as an additional motivator to faculty and, more importantly, it enables The University of Alabama to better fulfill the ‘service to society’ component of its mission.”

A variety of components, including Brown’s childhood readings about a famous inventor, factored into developing the concept leading to the patent.

“I remembered, as a kid, reading about Thomas Edison – who among other things, was a telegraph operator – and that good telegraph operators could tell who was on the other side of the wire based on his exact patterns of dots and dashes,” Brown recalled.

That early lesson in Morse code, in combination with some research Brown was exposed to while in graduate school at Texas A&M University, and others’ comments about recognizing individual typists based on their keyboard’s sounds, sparked the idea.

“All of these were sort of grist for the mill,” Brown said.
The invention enables any typical computer workstation, using a standard keyboard, to distinguish a computer user by the way they type their name.
“If you typed my name at a computer running my invention, the computer would be able to determine that you are not me,” Brown said. An obvious application for the technology is to improve information security.

“Rather than replace passwords, this technology would probably best be used to add another layer of authentication,” Brown said. “It could reduce the need for measures such as changing your password every six weeks.”

Most information security is “brittle,” Brown said, and companies are looking for ways to protect themselves and their clients from unauthorized access to sensitive information.
Under traditional brittle approaches, “If you get my password, there is not much else I can do,” Brown said. Systems using the UA invention would have an added security layer.

Brown and Rogers trained a neural network, a type of computer program which “learns” by example, using the precise time that each key is pressed and released by its user. Measured precisely enough, each person’s typing pattern is a “fingerprint” of sorts, unique to them.
Brown said he’s unsure if this uniqueness is related to the exact physical structure of individuals’ hands, or the way individuals break up words, mentally, when they type them, or, perhaps, some combination of the two along with other unknown factors.

Regardless, Brown said it’s gratifying to see the invention have the opportunity to benefit others. “It’s something brand new, and it’s really an exciting thing to see new ideas open up that can make a difference in someone’s life.”

PERSONAL COMPUTER

Personal computers, or microcomputers, were made possible by two technical innovations in the field of microelectronics: the integrated circuit, or IC, which was developed in 1959; and the microprocessor, which first appeared in 1971. The IC permitted the miniaturization of computer-memory circuits, and the microprocessor reduced the size of a computer's CPU to the size of a single silicon chip.
The microprocessor, a device that combines the equivalent of thousands of transistors on a single, tiny silicon chip, was developed by Ted Hoff at Intel Corporation in the Santa Clara Valley south of San Francisco, California, an area that was destined to become known to the world as Silicon Valley because of the microprocessor and computer industry that grew up there. Because a CPU calculates, performs logical operations, contains operating instructions, and manages data flows, the potential existed for developing a separate system that could function as a complete microcomputer.

The first such desktop-size system specifically designed for personal use appeared in 1974; it was offered by Micro Instrumentation Telemetry Systems (MITS). The owners of the system were then encouraged by the editor of a popular technology magazine to create and sell a mail-order computer kit through the magazine. The computer, which was called Altair, retailed for slightly less than $400.
The demand for the microcomputer kit was immediate, unexpected, and totally overwhelming. Scores of small entrepreneurial companies responded to this demand by producing computers for the new market. The first major electronics firm to manufacture and sell personal computers, Tandy Corporation (Radio Shack), introduced its model in 1977. It quickly dominated the field, because of the combination of two attractive features: a keyboard and a cathode-ray display terminal (CRT). It was also popular because it could be programmed and the user was able to store information by means of cassette tape.



The invention of the computer

There is not just one inventor of the computer, as the ideas of many scientists and engineers led to its invention. These ideas were developed in the 1930s and 1940s, mostly independently of each other, in Germany, Great Britain and the USA, and were turned into working machines.

In Germany, Konrad Zuse hit upon the idea of building a program-controlled calculating machine when he had to deal with extensive calculations in statics. In 1935, he began to design a program-controlled calculating machine in his parents' home in Berlin. It was based on the binary system and used punched tape for the program input. The Z1, which was built between 1936 and 1938, was a purely mechanical machine which was not fully operational. In 1940, Zuse began to build a successor to the Z1 based on relay technology. In May 1941, he finished the Z3, the world's first operational, freely programmable, program-controlled automatic calculator.

Pakistan Computer Bureau

The Government of Pakistan is taking several steps to put Pakistan on the information technology superhighway by introducing information technology at all levels in the public sector. E-Government is one of the major initiatives under the present IT Policy and Action Plan. To pursue this initiative, the Government needs to equip its workforce with the tools of IT. The mandate of the Pakistan Computer Bureau is to work in the area of E-Government, undertake training of Government officers and staff, help in the development, operation and maintenance of IT systems in Government departments, evaluate feasibility studies, and provide consultation on IT-related issues. The PCB has been performing E-Government activities since 1971, and its major thrust has been in that area. However, the extent to which computers are effectively employed in different departments depends on the proper training of officers and staff.


Besides enhancing the workforce of trained IT personnel, the Pakistan Computer Bureau is also aware of its responsibilities for addressing copyright and licensing problems for the computer software being used in Pakistan. With the implementation of WTO regulations, it is expected that the use of licensed software will become mandatory. This will have a great impact on the expenditure of users as well as place a great burden on the national exchequer.
The Government is promoting IT by encouraging the development and use of Open Source Software, and by preparing a workforce able to earn valuable foreign exchange through the export of software and of trained personnel. The implementation and use of software in domestic organizations (public and private) promotes efficiency and productivity. Without strict enforcement of copyright laws, there is no incentive for local software developers. Apart from the enforcement of copyright laws, it is also the duty of the Government to provide alternative IT solutions to individuals and organizations. The use of Open Source Software is being promoted with this aspect in mind. The focus is on the Linux operating system, OpenOffice and Linux databases such as MySQL or PostgreSQL. The Government of Pakistan is not the only government promoting the use of Open Source Software. A large number of countries and governments around the world are promoting its use, including Germany, France, India, South Africa, the U.K. and even the USA.