Submitted by: Tony Mack

There are several projects around the house where a superior finish really does make a difference; with a 16 gauge Paslode finish nailer to hand, you can easily get that high-quality appearance on your projects at home.

Just imagine it for a minute: you’ve decided to move house; you have a new one all picked out, and you’ve put your home on the market, hoping to get the best price for it. You have looked around your current home and recognize that one or two small tidying-up tasks need to be done to ensure it is completely ready for viewings. No hassle, you think to yourself, and you set about knocking things into shape; or so you believe.

You try to re-attach the wood paneling that you removed, a long time ago for some reason, and find that the nails you have simply won’t be suitable. No worries; get bigger nails, correct? That’s fine, but they won’t give the finish you require, and you discover that you’re having problems holding the nails and the board at the same time, so trying to hammer is virtually impossible. In addition, the dings and dents the hammer leaves make the paneling look rather unattractive.

Can there be a different option? Surely there is.


After that you attempt gluing them on. Everything seems to go fine, until you hear a thud in the night and realize that the discount-bin adhesive you used simply isn’t able to keep everything in place.

What’s the next solution?

For most people it may be to accept the knocked-down offer buyers have placed on the house, because they have argued that it ‘has work that needs to be done’. You’ve invested all of that time having the rest of the house fixed up, and yet they attempt to knock thousands off the value because of a few bits of not-so-great DIY. But you just have to accept that; right?

Of course you don’t; use a Paslode finish nailer 16 gauge to get the job done correctly, and save yourself thousands. The Paslode nail gun is a handy little time saver that has been developed with you in mind. It is simple to operate, doesn’t make a complete mess of the project you happen to be working on, and operates just about anywhere, unlike some of the people you might have considered hiring to complete the job for you.

How come they ‘work anywhere’?

The Paslode finish nailer 16 gauge is cordless, so you’re able to get to areas of the property where you don’t have a power supply and still complete those tasks, such as repairing the fence right at the end of the garden. Not only is this easier, it is also safer; you will not spend half of your time yanking hoses behind you and worrying that you are about to stumble over them.

Something else that makes the Paslode nail gun the coolest tool in your armoury is the fact that it comes in two types, one straight and one angled, which makes the Paslode finish nailer an extremely flexible power tool.

Have you ever had a project that needed you to nail something into the corner of a room?

Have you ever been able to get it done without spending at least three hours working out how you were going to do it, and then still making a complete mess of it? Well, with the Paslode finish nailer 16 gauge you will be slamming those nails in and wondering what all of the fuss was about.

So, before the holidays come around and you realize you will be expected to do those ‘small jobs’ around the house, do the smart thing: get a 16 gauge Paslode finish nailer and save yourself time and money; then you can spend the rest of your time off doing the things you want to do; well, if you’re allowed to, that is.

About the Author: Go to paslodereview.com/products/paslode-finish-nailers to get the full picture before buying the 16 gauge Paslode finish nailer to fix things up at home.

Source:

isnare.com

Permanent Link:

isnare.com/?aid=841673&ca=Home+Management

Former U.S. governor Palin signs TV deal with Fox News

Posted May 16th, 2019 by F4GKcb4Y

Tuesday, January 12, 2010

According to a report by the New York Times that has now been confirmed by Fox News, former United States governor of Alaska and 2008 Republican vice-presidential nominee Sarah Palin has signed a multi-year deal to work as a news contributor for American television network Fox News. The deal is effective immediately, and Palin will reportedly contribute regularly to all Fox channels. The monetary terms of the deal have not yet been disclosed, but the agreement is rumored to be for three years.

“Governor Palin has captivated everyone on both sides of the political spectrum…,” said Bill Shine, Fox News’s vice president of programming. “We are excited to add her dynamic voice to the Fox News lineup,” he added.


As well as contributing, Palin will also host a new program slated to air on Fox News. Called Real American Stories, the series will portray stories of people overcoming obstacles amid current social and economic problems. The series has no planned release date, but will air sometime in 2010.

Palin released a statement herself Monday afternoon. “I am thrilled to be joining the great talent and management team at Fox News. It’s wonderful to be part of a place that so values fair and balanced news,” she said.

After serving as mayor of the town of Wasilla, Palin was elected governor of Alaska in 2006. U.S. Republican candidate John McCain picked Palin as his running mate in the 2008 U.S. presidential election, which she and McCain lost to Democrat Barack Obama and his running mate Joe Biden. In July 2009, she resigned as governor of Alaska, possibly opening up a run for U.S. president in 2012.

Wednesday, March 19, 2008

The United Arab Emirates (UAE) has announced its first national authority for scientific research (NASR) to coordinate and fund scientific research in the country.

The national authority for scientific research was announced on March 7 by Shaikh Nahyan bin Mubarak Al Nahyan, UAE minister for higher education and scientific research. NASR will begin with an annual budget of AED100 million (approximately US$27.2 million). The authority hopes to receive additional contributions from the public and the private sector.

NASR will look to fund research projects in various fields, including engineering, technology, medicine, water and agriculture, proposing specific projects to be competed for by researchers at universities and private research institutes.

“Projects are going to be selected to help promote scientific research and the growth of UAE society and we will compare them with international scientific research criteria,” Gulf News quoted Al Nahyan as saying at the launch.

NASR will also train scientists and develop programmes for promoting public science awareness. It will also coordinate with government authorities on the issue of intellectual property rights, by providing advice on how companies and research centres should go about protecting their discoveries in the form of patents or licenses. It will also provide scholarships for researchers in the UAE to work on international research programmes, and organise national scientific conferences. NASR forms part of the UAE’s strategic plan to improve higher education and scientific research.

Zakaria Maamar, associate professor at the College of Information Technology at Zayed University, UAE, told Science and Development Network (SciDev.Net) that, “This initiative is another boost to the research and development activities that are carried out in the UAE. It will definitely provide researchers with the appropriate funds to sustain such activities and promote best practices in the community.”

Said Elnaffar, assistant professor at the college of information technology at the United Arab Emirates University, told SciDev.Net that, with this initiative, the UAE is taking the lead and setting a good example by building a strong development infrastructure founded on knowledge discovery and research.


This article is based on UAE launches national authority for scientific research by scidev.net (Wagdy Sawahel), which has a copyright policy compatible with our CC-BY 2.5 (specifically, “CC-BY 2.0 UK”).

Sunday, May 28, 2006

Stardust is a NASA space capsule that collected samples from comet 81P/Wild (also known as “Wild 2”) in deep space and landed back on Earth on January 15, 2006. It was decided that a collaborative online review process would be used to “discover” the microscopically small samples the capsule collected. The project is called Stardust@home. Unlike distributed computing projects like SETI@home, Stardust@home relies entirely on human intelligence.

Andrew Westphal is the director of Stardust@home. Wikinews interviewed him for May’s Interview of the Month (IOTM) on May 18, 2006. As always, the interview was conducted on IRC, with multiple people asking questions.

Some may not know exactly what Stardust or Stardust@home is. Can you explain more about it for us?

Stardust is a NASA Discovery mission that was launched in 1999. It is really two missions in one. The primary science goal of the mission was to collect a sample from a known primitive solar-system body, a comet called Wild 2 (pronounced “Vilt-two” — the discoverer was German, I believe). This is the first US “sample return” mission since Apollo, and the first ever from beyond the moon. This gives a little context. By “sample return” of course I mean a mission that brings back extraterrestrial material. I should have said above that this is the first “solid” sample return mission — Genesis brought back a sample from the Sun almost two years ago, but Stardust is also bringing back the first solid samples from the local interstellar medium — basically this is a sample of the Galaxy. This is absolutely unprecedented, and we’re obviously incredibly excited. I should mention parenthetically that there is a fantastic launch video — taken from the POV of the rocket on the JPL Stardust website — highly recommended — best I’ve ever seen — all the way from the launch pad, too. Basically interplanetary trajectory. Absolutely great.

Is the video available to the public?

Yes [see below]. OK, I digress. The first challenge that we have before we can do any kind of analysis of these interstellar dust particles is simply to find them. This is a big challenge because they are very small (on the order of a micron in size) and are somewhere (we don’t know where) on a HUGE collector — at least on the scale of the particle size — about a tenth of a square meter. So

We’re right now using an automated microscope that we developed several years ago for nuclear astrophysics work to scan the collector in the Cosmic Dust Lab in Building 31 at Johnson Space Center. This is the ARES group that handles returned samples (Moon Rocks, Genesis chips, Meteorites, and Interplanetary Dust Particles collected by U2 in the stratosphere). The microscope collects stacks of digital images of the aerogel collectors in the array. These images are sent to us — we compress them and convert them into a format appropriate for Stardust@home.

Stardust@home is a highly distributed project using a “Virtual Microscope” that is written in html and javascript and runs on most browsers — no downloads are required. Using the Virtual Microscope volunteers can search over the collector for the tracks of the interstellar dust particles.
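The browser-based Virtual Microscope described above can be pictured with a minimal sketch. To be clear, this is not the actual Stardust@home code; the class, image names, and stack size below are hypothetical. It only illustrates the core interaction the interview describes: a volunteer stepping through a stack of images taken at different focus depths and flagging candidate particle tracks.

```javascript
// Hypothetical sketch of a focus-stack viewer, in the spirit of the
// HTML/JavaScript Virtual Microscope (names and details are invented).
class VirtualMicroscope {
  constructor(imageUrls) {
    this.imageUrls = imageUrls; // one image per focus depth
    this.level = 0;             // current focus level in the stack
    this.marks = [];            // positions the volunteer has flagged
  }
  // Step up or down through the focus stack, clamped to its bounds.
  focus(delta) {
    this.level = Math.min(this.imageUrls.length - 1,
                          Math.max(0, this.level + delta));
    return this.imageUrls[this.level];
  }
  // Record a suspected track at (x, y) on the current focus level.
  mark(x, y) {
    this.marks.push({ x, y, level: this.level });
  }
}

// Usage: wire focus() to the mouse wheel and mark() to clicks on the image.
const vm = new VirtualMicroscope(["z0.jpg", "z1.jpg", "z2.jpg"]);
vm.focus(+1);     // step one focus level deeper
vm.focus(+5);     // clamped at the deepest image in the stack
vm.mark(120, 80); // volunteer flags a candidate track
```

Because the real tracks are three-dimensional features in the aerogel, letting volunteers sweep through focus depths (rather than showing a single flat image) is what makes them recognizable by eye.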

How many samples do you anticipate being found during the course of the project?

Great question. The short answer is that we don’t know. The long answer is a bit more complicated. Here’s what we know. The Galileo and Ulysses spacecraft carried dust detectors onboard that Eberhard Gruen and his colleagues used to first detect and then measure the flux of interstellar dust particles streaming into the solar system. (This is a kind of “wind” of interstellar dust, caused by the fact that our solar system is moving with respect to the local interstellar medium.) Markus Landgraf has estimated the number of interstellar dust particles that should have been captured by Stardust during two periods of the “cruise” phase of the interplanetary orbit in which the spacecraft was moving with this wind. He estimated that there should be around 45 particles, but this number is very uncertain — I wouldn’t be surprised if it is quite different from that. That was the long answer! One thing that I should say is that, like all research, the outcome of what we are doing is highly uncertain. There is a wonderful quote attributed to Einstein — “If we knew what we were doing, it wouldn’t be called ‘research’, would it?”

How big would the samples be?

We expect that the particles will be of order a micron in size. (A millionth of a meter.) When people are searching using the virtual microscope, they will be looking not for the particles, but for the tracks that the particles make, which are much larger — several microns in diameter. Just yesterday we switched over to a new site which has a demo of the VM (virtual microscope) I invite you to check it out. The tracks in the demo are from submicron carbonyl iron particles that were shot into aerogel using a particle accelerator modified to accelerate dust particles to very high speeds, to simulate the interstellar dust impacts that we’re looking for.

And that’s on the main Stardust@home website [see below]?

Yes.

How long will the project take to complete?

Partly the answer depends on what you mean by “the project”. The search will take several months. The bottleneck, we expect (but don’t really know yet), is in the scanning — we can only scan about one tile per day and there are 130 tiles in the collector…. These particles will be quite diverse, so we’re hoping that we’ll continue to have lots of volunteers collaborating with us on this after the initial discoveries. It may be that the 50th particle that we find will be the real Rosetta stone that turns out to be critical to our understanding of interstellar dust. So we really want to find them all! Enlarging the idea of the project a little beyond the search, though, the larger goal is to actually analyze these particles. That’s the whole point, obviously!

And this is the huge advantage with this kind of a mission — a “sample return” mission.

Most missions do things quite differently: you have to build an instrument to make a measurement, and that instrument design gets locked in several years before launch, practically guaranteeing that it will be obsolete by the time you launch. Here exactly the opposite is true. Several of the instruments that are now being used to analyze the cometary dust did not exist when the mission was launched. Further, some instruments (e.g., synchrotrons) are the size of shopping malls — you don’t have a hope of flying these in space. So we can and will study these samples for many years. AND we have to preserve some of these dust particles for our grandchildren to analyze with their hyper-quark-gluon plasma microscopes (or whatever)!

When do you anticipate the project to start?

We’re really frustrated with the delays that we’ve been having. Some of it has to do with learning how to deal with the aerogel collectors, which are rougher and more fractured than we expected. The good news is that they are pretty clean — there is very little of the dust that you see on our training images — these were deliberately left out in the lab to collect dust so that we could give people experience with the worst case we could think of. In learning how to do the scanning of the actual flight aerogel, we uncovered a couple of bugs in our scanning software — which forced us to go back and rescan. Part of the other reason for the delay was that we had to learn how to handle the collector — it would cost $200M to replace it if something happened to it, so we had to develop procedures to deal with it, and add several new safety features to the Cosmic Dust Lab. This all took time. Finally, we’re distracted because we also have many responsibilities for the cometary analysis, which has a deadline of August 15 for finishing analysis. The IS project has no such deadline, so at times we had to delay the IS (interstellar, sorry) in order to focus on the cometary work. We are very grateful to everyone for their patience on this — I mean that very sincerely.

And rest assured that we’re just as frustrated!

I know there will be a “test” that participants will have to take before they can examine the “real thing”. What will that test consist of?

The test will look very similar to the training images that you can look at now. But.. there will of course be no annotation to tell you where the tracks are!

Why did NASA decide to take the route of distributed computing? Will they do this again?

I wouldn’t say that NASA decided to do this — the idea for Stardust@home originated here at U. C. Berkeley. Part of the idea of course came…

If I understand correctly it isn’t distributed computing, but distributed eyeballing?

…from the SETI@home people who are just down the hall from us. But as Brian just pointed out, this is not really distributed computing like SETI@home: the computers are just platforms for the VM, and it is human eyes and brains that are doing the real work, which makes it fun (IMHO).

That said… there have been quite a few people who have expressed interest in developing automated algorithms for searching. Just because WE don’t know how to write such an algorithm doesn’t mean nobody does. We’re delighted at this and are happy to help make it happen.

Isn’t there a catch-22, in that the data you’re going to collect would be a prerequisite to automating the process?

That was the conclusion that we came to early on — that we would need some sort of training set to be able to train an algorithm. Of course you have to train people too, but we’re hoping (we’ll see!) that people are more flexible in recognizing things that they’ve never seen before and pointing them out. Our experience is that people who have never seen a track in aerogel can learn to recognize them very quickly, even against a big background of cracks, dust and other sources of confusion… Coming back to the original question — although NASA didn’t originate the idea, they are very generously supporting this project. It wouldn’t have happened without NASA’s financial support (and of course access to the Stardust collector). Did that answer the question?

Will a project like this be done again?

I don’t know… There are only a few projects for which this approach makes sense… In fact, I frankly haven’t run across another at least in Space Science. But I am totally open to the idea of it. I am not in favor of just doing it as “make-work” — that is just artificially taking this approach when another approach would make more sense.

How did the idea come up to do this kind of project?

Really desperation. When we first thought about this we assumed that we would use some sort of automated image recognition technique. We asked some experts around here in CS, and the conclusion was that the problem was somewhere between trivial and impossible, and we wouldn’t know until we had some real examples to work with. So we talked with Dan Werthimer and Dave Anderson (literally down the hall from us) about the idea of a distributed project, and they were quite encouraging. Dave proposed the VM machinery, and Josh Von Korff, a physics grad student, implemented it. (Beautifully, I think. I take no credit!)

I got to meet one of the Stardust directors in March during the Texas Aerospace Scholars program at JSC. She talked about searching for meteorites in Antarctica, ones that were unblemished by Earth conditions. Is that our best chance of finding new information on comets and asteroids? Or will more Stardust programs be our best solution?

That’s a really good question. Much will depend on what we learn during this official “Preliminary Examination” period for the cometary analysis. Aerogel capture is pretty darn good, but it’s not perfect and things are altered during capture in ways that we’re still understanding. I think that much also depends on what question you’re asking. For example, some of the most important science is done by measuring the relative abundances of isotopes in samples, and these are not affected (at least not much) by capture into aerogel.

Also, she talked about how some of the agencies that they gave samples to had lost or destroyed 2-3 samples while trying to analyze them. One, in fact, had become statically charged and stuck to the side of the microscope lens, and they spent over an hour looking for it. Is that really our biggest danger: giving out samples as a show of good faith, and not letting NASA examine all samples collected?

These will be the first measurements, probably, that we’ll make on the interstellar dust. There is always a risk of loss. Fortunately for the cometary samples there is quite a lot there, so it’s not a disaster. NASA has some analytical capabilities, particularly at JSC, but the vast majority of the analytical capability in the community is not at NASA but is at universities, government labs and other institutions all over the world. I should also point out that practically every analytical technique is destructive at some level. (There are a few exceptions, but not many.) The problem with meteorites is that, except in a very few cases, we don’t know where they specifically came from. So having a sample that we know for sure is from the comet is golden!

I am currently working on my Bachelor’s in computer science, with a minor in astronomy. Do you see the successes of programs like Stardust opening up more private space exploration positions for people such as myself, even though I’m not in the typical “space” fields of education?

Can you elaborate on your question a little — I’m not sure that I understand…

Well, while at JSC I learned that they mostly want engineers, and a few science grads, and I worry that my computer science degree will not be very valuable, as the NASA rep told me only 1% of the applicants for their work-study program are CS majors. I’m just curious as to your thoughts on whether CS majors will be more in demand now that projects like Stardust and the Mars missions have been great successes. Have you seen a trend towards more private businesses moving in that direction, especially with President Bush’s statement of Man on the Moon in 2015?

That’s a good question. I am personally not very optimistic about the direction that NASA is going. Despite recent successes, including but not limited to Stardust, science at NASA is being decimated.

I made a joke with some people at the TAS event that one day SpaceShipOne will be sent up to save stranded ISS astronauts. It makes me wonder what kind of private redundancy the US government is taking for future missions.

I guess one thing to be a little cautious about is that despite SpaceShipOne’s success, we haven’t had an orbital project that has been successful in that style of private enterprise. It would be nice to see that happen. I know that there’s a lot of interest…!

Now I know the answer to this question… but a lot do not… When samples are found, how will they be analyzed? Who gets the credit for finding the samples?

The first person who identifies an interstellar dust particle will be acknowledged on the website (and probably will be much in demand for interviews from the media!), will have the privilege of naming the particle, and will be a co-author on any papers that WE (at UCB) publish on the analysis of the particle. Also, although we are precluded from paying for travel expenses, we will invite those who discover particles AND the top performers to our lab for a hands-on tour.

We have some fun things, including micromachines.

How many people/participants do you expect to have?

About 113,000 have preregistered on our website. Frankly, I don’t have a clue how many will actually volunteer and do a substantial amount of searching. We’ve never done this before, after all!

One last thing I want to say … well, two. First, we are going to special efforts not to do any searching ourselves before we go “live”. It would not be fair to all the volunteers for us to get a jumpstart on the search. All we are doing is looking at a few random views to make sure that the focus and illumination are good. (And we haven’t seen anything — no surprise at all!) Also, the attitude for this should be “Have Fun”. If you’re not having fun doing it, stop and do something else! A good maxim for life in general!

Tuesday, May 15, 2007

Software giant Microsoft’s chief lawyer Brad Smith claimed in an interview published in the magazine Fortune on Monday that open-source software products violate 235 of Microsoft’s patents. The main transgressors are claimed to be Linux (107 patents) and OpenOffice.org (45), with e-mail programs infringing 15 patents. Microsoft wants royalties to compensate for the patent breaches.

According to Microsoft’s Vice-President of intellectual property and licensing, Horacio Gutierrez, the company wants to negotiate with the open-source companies rather than sue them. “If we wanted to litigate we would have done that a long time ago. Litigation is not an effective way of going about solutions,” Gutierrez said. According to him, Microsoft has in recent years tried to work towards a “constructive” solution to the alleged problem of patent violation.

Microsoft in the past has used the strategy of cross-licensing to get royalties from companies who infringe their patents, for example in their deal with Novell. On a company blog, Novell reiterated that their deal “is in no way an acknowledgment that Linux infringes upon any Microsoft intellectual property.”

“We don’t think that customers will want to continue on without a solution to the problem,” Gutierrez said about Microsoft’s approach to guaranteeing companies that they won’t get sued because they use the allegedly patent-infringing Linux operating system.

The upcoming third version of the GPL licence, the licence under which Linux is released, will prohibit Linux distributors from agreeing to patent royalty deals. Microsoft called these “attempts to tear down the bridge between proprietary and open-source software that Microsoft has worked to build with the industry and customers.”

A related U.S. Supreme Court ruling from April 30th showed how software patents can be subject to court challenges; basically, if the innovations patented are “obvious”, the patent is weakened. Joe Lindsay, information officer for a mortgage company, pointed out that the Unix code that Linux is based upon preceded Microsoft Windows, which might also be a reason for some patents to be invalid.

Red Hat, the biggest Linux distributor, said in a statement on Monday:

The reality is that the community development approach of free and open source code represents a healthy development paradigm, which, when viewed from the perspective of pending lawsuits related to intellectual property, is at least as safe as proprietary software.
 

Larry Augustin, former CEO of a company called VA Linux (now VA Software), responsible among other things for launching SourceForge.net, an open-source software development community, posted a message on his blog under the title “It’s Time for Microsoft to Put Up or Shut Up”:

If Microsoft believes that Free and Open Source Software violates any of their patents, let them put those patents forward now, in the light of day, where we can all evaluate them on their merits. If not, then stop trying to bully customers into paying royalties to use Open Source.

According to the Fortune report, more than half of the Fortune 500 companies are estimated to use Linux in their data centers.

Five jailed in Tyler, Texas following robbery and scam


Friday, November 14, 2014

Four men and one woman were in jail in Tyler, Texas, as of yesterday morning following a robbery allegedly involving an elaborate scam.

Police alleged Alneisha Butler originated a scheme to lure a 25-year-old man to an apartment complex in Tyler late Tuesday night, where he was accosted at gunpoint by four men. Butler and the man had been corresponding online for a few weeks when she told him she wanted him to meet some friends of hers, according to police.

When the two met at the complex, the four men confronted them and demanded money from the victim. He didn’t have cash, so the assailants ordered him to get cash from an ATM (Automated Teller Machine) while they held the woman until his return. The man left the scene and notified police by phone. The men named as alleged assailants in the crime were Lawrence Caston, Shannon Howard, Justin McGee, and David Roberts.

The woman and each of the four men were being held on a US$750,000 bond. A police spokesperson cautioned, “This could have [just as] easily happened if he met her at a bar as it happened online. If you go somewhere with someone and they start leading you to a place you don’t feel comfortable, then don’t go”.

New Zealand medical student funding to be reviewed


Monday, February 20, 2006

The New Zealand government has announced that it will be reviewing funding for medical and dentistry students at Otago and Auckland Universities to certify the institutions’ standards and help staff retention.

The dean of Auckland University’s Faculty of Medical and Health Sciences, Professor Iain Martin, says the review “can’t come soon enough”.

The Medical Students Association welcomes the review. It says it has been worried about student debt for years: “High debt encourages too many graduates overseas, or into high-paying areas of practice at the expense of areas like general practice.”

U.S., Mexico and U.K. top medalists at RoboGames 2009


Wednesday, June 24, 2009

The 6th annual RoboGames, a robot competition that takes place in the United States, was held this month with 403 robots from 18 different parts of the globe competing in the categories of combat, sumo, robo-one/androids, open, hockey, art bots and junior league.

This year’s medals went to the United States, Mexico, the United Kingdom, Brazil, India, Canada, Indonesia, South Korea and Russia (see table below). Other participants were Australia, Austria, Colombia, Egypt, Hong Kong, Iran, Japan, Peru and Taiwan.

RoboGames 2009
Country Gold Silver Bronze Total
United States 31 30 24 85
Mexico 2 3 4 9
United Kingdom 6 2 0 8
Brazil 2 3 2 7
India 0 1 2 3
Canada 1 0 1 2
Indonesia 1 0 0 1
South Korea 1 0 0 1
Russia 1 0 0 1

Although the host country carried off most of the medals, visiting countries stood out at some events, like Mexico at 1 lb autonomous combat, 500 g autonomous sumo, 100 g autonomous sumo and autonomous line follower; Brazil at 3 kg sumo (both autonomous and radio-controlled); and the United Kingdom in the “best of show” and “walker challenge” categories.

“Taking part gave us the opportunity to test our knowledge against students from other nations, and proved that we are at the same level or better than other students from famous schools”, said Mexico’s National Polytechnic Institute student Erick Rodríguez who, along with his fellow team member Rogelio Baeza, took gold in the autonomous line follower event.

RoboGames, previously ROBOlympics, holds the Guinness Record for “world’s largest robot competition”. It was founded in 2004 by David Calkins to help robot builders exchange ideas and learn from each other.

Man attacks people, kills 7 in Akihabara, Tokyo


Sunday, June 8, 2008

A lively shopping district was suddenly struck by an indiscriminate murder spree on a Sunday afternoon. A young man attacked a crowd with a rented truck, and then stabbed people with a dagger, around 12:35 JST (3:35 UTC) on Sunday in the Akihabara district of Chiyoda Ward, Tokyo. The suspect, identified as Tomohiro Katō, aged 25, was arrested on suspicion of attempted murder. Seven people were killed, and ten injured.

The injured people include pedestrians, a 53-year-old policeman, and a taxi driver, 54, who is believed to have left his taxi in order to help a pedestrian hit by the truck. About 17 ambulances came to the scene. NHK reports that medical teams trained for disasters joined the rescue.

The dead are six men aged between 19 and 74 and a 21-year-old woman, all of whom have been identified by police.

The truck, traveling on Kanda Myōjin-dōri (Kanda Myōjin Avenue), ran into the crowd at a crossing on Chūō-dōri (Chūō Avenue), which is a vehicle-free road every Sunday. After entering the crowd, the truck continued moving for about 30 meters. The suspect then exited the truck and began stabbing people. According to a witness, the suspect, upon being pursued by police, ran south on Chūō-dōri and was cornered in a narrow alley. The suspect dropped his knife after the police officer drew his gun, and was overpowered by the officer and several bystanders.

The suspect has been quoted as confessing to police, “I came to Akihabara to kill people. I’m tired of the world. Anyone was OK. Today I came alone.” The suspect, a native of Aomori City, lived and worked as a dispatched temporary employee in Shizuoka Prefecture. After being held at the Manseibashi Police Station, he was sent on Tuesday to the Tokyo District Public Prosecutors Office.

At the scene, a table covered with a white cloth was set for memorial flowers and other offerings on the morning of Monday, June 9. As acquaintances of the dead and passersby visited, the table soon filled—almost flooded—with flowers, drink bottles, pictures, and folded paper cranes.

The incident occurred on the same date as the 2001 Osaka school massacre, in which eight elementary school students were killed.

NHK reported the incident as confirmed within 45 minutes, interrupting a news channel program for one minute. Some newspapers created special issues to report the incident; because June 8 was a Sunday and also a press holiday, many newspapers in Japan were not scheduled for delivery that evening or the following morning.

The Akihabara district is famous for its Electric Town, with wholesalers and retailers of video games and electronic gadgets such as PC parts, televisions and kitchen appliances. It has also become famous for distinctive cultural trends created or supported by the youth.

Local residents, business owners, and government officials from Chiyoda Ward assembled for an emergency meeting on Monday to discuss whether the practice of maintaining the vehicle-free area on Sundays ought to be continued. The Tokyo Metropolitan Public Safety Commission is also expected to discuss the issue.

Sunday, September 23, 2012

Strathclyde Police have arrested a 19-year-old man in relation to a reported incident of a boy, aged four, being sexually assaulted in the toilet of an Asda supermarket in the town of Clydebank in West Dunbartonshire, Scotland. The suspect is being held in police custody and is scheduled to appear on Tuesday at Dumbarton Sheriff Court.

Police were initially alerted to the incident at 31 Britannia Way in Clydebank at approximately 1355 BST (1255 UTC) on Tuesday. According to Sky News, the boy’s mother had given him permission to enter the toilet by himself as she waited outside. Police said the incident took place in the minutes following the boy’s entrance into the toilet; upon his departure, the boy raised the alert.

“This is an isolated incident, nevertheless, one that has caused significant stress to the young child and his family,” said Detective Inspector Graham Cordner, who said the child was not injured and is at home with his family.

Police said that they had spent a full day interviewing the child and an additional day conducting initial investigations into the incident. All supermarket staff have been questioned and CCTV footage has been examined.

“We have taken this report very seriously”, said a spokeswoman for Asda. “We alerted the police and are supporting them fully in their investigation.”