A hacker, or group of hackers, operating under the alias The Dark Overlord uploaded ten episodes of Netflix’s web TV series Orange is the New Black to The Pirate Bay on Friday and Saturday, after they said the online streaming service failed to meet their demands. Netflix had planned to release the season on June 9.
According to The New York Times, the unreleased content from the upcoming fifth season of Orange is the New Black was likely stolen from Larson Studios, a postproduction company based in Los Angeles. Netflix said in a statement, “A production vendor used by several major TV studios had its security compromised and the appropriate law enforcement authorities are involved.” In a tweet on Saturday, the hacker said, “Who is next on the list? FOX, IFC, NAT GEO, and ABC. Oh, what fun we’re all going to have. We’re not playing any games anymore.”
The hacker tweeted about uploading the first episode on The Pirate Bay on Saturday, saying, “Let’s try to be a bit more direct, Netflix”. The hacker allegedly demanded an amount of money, which they publicly described as “modest”, from Netflix in exchange for not releasing the episodes prematurely. The New York Times reported that the final three episodes were not pirated, since the security breach occurred before the postproduction studio was handed those episodes. In January, the same hacker erased data from the servers of a Muncie-based charity called Little Red Door Cancer Services of East Central Indiana, demanding a ransom of 50 bitcoins, then estimated to be worth about US$43,000, to restore the data.
The Federal Bureau of Investigation is reportedly investigating this cyber crime. Netflix has more than 100 million subscribers, CEO Reed Hastings announced recently. Variety noted that Netflix’s shares experienced a 0.57% loss on the day the first episode was uploaded by the hacker.
On November 13, Torontonians will be heading to the polls to vote for their ward’s councillor and for mayor. Among Toronto’s ridings is Don Valley West (Ward 25). This ward’s candidates include John Blair, Robertson Boyle, Tony Dickins, Cliff Jenkins (incumbent), and Peter Kapsalis; three of them responded to Wikinews’ requests for an interview.
Pennsylvania governor Ed Rendell announced Tuesday morning that a deal had been struck between state and local officials and the Pittsburgh Penguins hockey franchise. The Penguins organization will formally announce the deal tonight, prior to the Penguins game against the Buffalo Sabres at the Mellon Arena. The deal will ensure that the Penguins remain in the city, with a 30-year lease on a new arena to be built in downtown Pittsburgh. The framework of the deal was constructed in an emergency meeting last Thursday in Philadelphia, when both government and franchise officials indicated that progress had been made, with the details laid out over the weekend. Under the new deal, the Penguins organization would be expected to pay $3.8 million per year, with a further $7.5 million per year coming from each of Don Barden, owner of Majestic Star Casino, and the state economic development fund. The Penguins organization has also been given the option of building a parking garage on Pittsburgh Sports Authority property between Centre and Fifth avenues by contributing $500,000 per year.
The new arena is expected to cost approximately $290 million, and should be completed and ready to host hockey games by 2009. The Penguins will sign a temporary lease to keep the team at Mellon Arena until the new building is finished.
Yesterday, the German Bundestag passed a law to legalise cannabis for medicinal purposes. The law is to come into effect in March.
“Seriously ill people must be treated in the best ways possible” (German: “Schwerkranke Menschen müssen bestmöglich versorgt werden.”), German health minister Hermann Gröhe tweeted. Doctors can prescribe marijuana — cannabis — for patients suffering from multiple sclerosis, chronic pain, or loss of appetite or nausea caused by chemotherapy treatment for cancer.
Christian Democrats (CDU) lawmaker Rainer Hayek said this law would still prevent recreational use of cannabis. The cost of cannabis is to be covered under health insurance. Patients can buy dried buds or cannabis extracts from pharmacies with a prescription or get synthetic derivatives from other countries, though possession of the drug in large quantities is not allowed.
Cannabis cultivation is to be monitored by the government. Germany has joined other European countries such as Austria, Spain, France, Italy, Portugal and the Netherlands in legalising the drug to some extent.
In October, a 53-year-old multiple sclerosis patient demonstrated that cannabis was the only effective means of reducing his pain, and a court granted him permission to grow as many as 130 plants per year for personal use. Purchasing, rather than growing, medical cannabis at the time cost about €15 (US$16.85) per gram.
The San Diego, California suburb of Chula Vista has responded to the recent housing crisis with an aggressive blight control ordinance that compels lenders to maintain the appearance of vacant homes. As foreclosures increase both locally and throughout the United States, the one-year-old ordinance has become a model for other cities overwhelmed by the problem of abandoned homes that decay into neighborhood eyesores.
Chula Vista city code enforcement manager Doug Leeper told the San Diego Union-Tribune that over 300 jurisdictions have contacted his office during the past year with inquiries about the city’s tough local ordinance. Coral Springs, Florida; the California cities of Stockton, Santee, and Murrieta; and Riverside County have all modeled recently enacted anti-blight measures after Chula Vista’s. On Wednesday, 8 October, the Escondido City Council also voted to tighten local measures making lenders more accountable for maintenance of empty homes.
Under the Chula Vista ordinance, lenders become legally responsible for upkeep as soon as a notice of mortgage default gets filed on a vacant dwelling, before actual ownership of the dwelling returns to the lender. Leeper regards that as “the cutting-edge part of our ordinance”. Chula Vista also requires prompt registration of vacant homes and applies stiff fines, as high as US$1,000 per day, for failure to maintain a property. Since foreclosed properties are subject to frequent resale between mortgage brokers, city officials enforce the fines by sending notices to every name on title documents and placing a lien on the property, which prevents further resale until outstanding fines have been paid. In the year since the ordinance went into effect, the city has applied $850,000 in fines and penalties, of which it has collected $200,000 to date. The city has collected an additional $77,000 in registration fees on vacant homes.
Jolie Houston, an attorney in San Jose, believes “Lenders will respond when it costs them less to maintain the property than to ignore local agency requirements.” Traditionally, local governments have resorted to addressing blight problems on abandoned properties with public funds, mowing overgrown lawns and performing other vital functions, then seeking repayment afterward. Chula Vista has moved that responsibility to an upfront obligation upon lenders.
As one of the fastest growing cities in the United States during recent years, Chula Vista saw 22.6% growth between 2000 and 2006, which brought the city’s population from 173,556 in the 2000 census to an estimated 212,756, according to the U.S. Census Bureau. Chula Vista placed among the nation’s 20 fastest growing cities in 2004. A large proportion of local homes were purchased during the recent housing boom using creative financing options that purchasers did not understand were beyond their means. Average home prices in San Diego County declined by 25% in the last year, which is the steepest drop on record. Many homeowners in the region currently owe more than their homes are worth and confront rising balloon payment mortgages that they had expected to afford by refinancing new equity that either vanished or never materialized. In August 2008, Chula Vista’s eastern 91913 zip code had the highest home mortgage default rate in the county with 154 filings and 94 foreclosures, an increase of 154% over one year previously. Regionally, the county saw 1,979 foreclosures in August.
Professionals from the real estate and mortgage industries object to Chula Vista’s response to the crisis for the additional burdens it places on their struggling finances. Said San Diego real estate agent Marc Carpenter, “that kind of measure will add additional costs to banks that have been hit really hard already and ultimately the cost will be transferred down to consumers and investors.” Yet city councils in many communities have been under pressure to do something about increasing numbers of vacant properties. Concentrations of abandoned and neglected homes can attract vandals who hasten the decline of struggling neighborhoods. Jolie Houston explained that city officials “can’t fix the lending problem, but they can try to prevent neighborhoods from becoming blighted.”
CEO Robert Klein of Safeguard, a property management firm, told the Union-Tribune that his industry is having difficulty adapting to the rapidly changing local ordinances. “Every day we discover a new ordinance coming out of somewhere”, he complained. Dustin Hobbs, a spokesman for the California Association of Mortgage Bankers, agreed that uneven local ordinances are likely to increase the costs of lending. Hobbs advised that local legislation is unnecessary due to California State Senate Bill 1137, which was recently approved to address blight. Yet according to Houston, the statewide measure falls short because it fails to address upkeep needs during the months between the time when foreclosure begins and when the lender takes title.
Stardust is a NASA space capsule that collected samples from comet 81P/Wild (also known as “Wild 2”) in deep space and landed back on Earth on January 15, 2006. It was decided that a collaborative online review process would be used to “discover” the microscopically small samples the capsule collected. The project is called Stardust@home. Unlike distributed computing projects like SETI@home, Stardust@home relies entirely on human intelligence.
Andrew Westphal is the director of Stardust@home. Wikinews interviewed him for May’s Interview of the Month (IOTM) on May 18, 2006. As always, the interview was conducted on IRC, with multiple people asking questions.
Some may not know exactly what Stardust or Stardust@home is. Can you explain more about it for us?
Stardust is a NASA Discovery mission that was launched in 1999. It is really two missions in one. The primary science goal of the mission was to collect a sample from a known primitive solar-system body, a comet called Wild 2 (pronounced “Vilt-two” — the discoverer was German, I believe). This is the first US “sample return” mission since Apollo, and the first ever from beyond the moon. This gives a little context. By “sample return” of course I mean a mission that brings back extraterrestrial material. I should have said above that this is the first “solid” sample return mission — Genesis brought back a sample from the Sun almost two years ago, but Stardust is also bringing back the first solid samples from the local interstellar medium — basically this is a sample of the Galaxy. This is absolutely unprecedented, and we’re obviously incredibly excited. I should mention parenthetically that there is a fantastic launch video — taken from the POV of the rocket — on the JPL Stardust website — highly recommended — best I’ve ever seen — all the way from the launch pad, too. Basically interplanetary trajectory. Absolutely great.
Is the video available to the public?
Yes [see below]. OK, I digress. The first challenge that we have before we can do any kind of analysis of these interstellar dust particles is simply to find them. This is a big challenge because they are very small (order of a micron in size) and are somewhere (we don’t know where) on a HUGE collector — at least on the scale of the particle size — about a tenth of a square meter. So…
We’re right now using an automated microscope that we developed several years ago for nuclear astrophysics work to scan the collector in the Cosmic Dust Lab in Building 31 at Johnson Space Center. This is the ARES group that handles returned samples (Moon rocks, Genesis chips, meteorites, and interplanetary dust particles collected by U-2 aircraft in the stratosphere). The microscope collects stacks of digital images of the aerogel collectors in the array. These images are sent to us — we compress them and convert them into a format appropriate for Stardust@home.
Stardust@home is a highly distributed project using a “Virtual Microscope” that is written in HTML and JavaScript and runs on most browsers — no downloads are required. Using the Virtual Microscope, volunteers can search over the collector for the tracks of the interstellar dust particles.
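[To make the browser-based search concrete, the following is a minimal sketch, in plain JavaScript, of how a focus-stack viewer of this general kind could work. It is not the actual Stardust@home code: the plane count, tile file layout, canvas element ID, and reporting endpoint are all invented for illustration.]

// Hypothetical "virtual microscope": scroll to focus up and down through
// an aerogel image stack, click to report a candidate track.
const NUM_PLANES = 40;  // assumed number of focus planes per image stack
const stack = [];       // preloaded images, one per focal depth
let plane = 0;          // index of the focal plane currently displayed

// Preload every image in the stack so changing focus feels instantaneous.
for (let i = 0; i < NUM_PLANES; i++) {
  const img = new Image();
  img.src = `tiles/tile_0001/plane_${i}.jpg`;  // hypothetical URL scheme
  stack.push(img);
}

const canvas = document.getElementById('vm-canvas');
const ctx = canvas.getContext('2d');

function draw() {
  const img = stack[plane];
  if (img.complete) ctx.drawImage(img, 0, 0, canvas.width, canvas.height);
}

// The mouse wheel moves the focal plane; a real particle track reveals
// itself by sharpening and blurring as the focus passes through it.
canvas.addEventListener('wheel', (e) => {
  e.preventDefault();
  plane = Math.min(NUM_PLANES - 1, Math.max(0, plane + Math.sign(e.deltaY)));
  draw();
}, { passive: false });

// Clicking marks a candidate track; the tile, coordinates, and focal
// plane are posted to a (hypothetical) server endpoint for later review.
canvas.addEventListener('click', (e) => {
  const rect = canvas.getBoundingClientRect();
  fetch('/api/report-candidate', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      tile: 'tile_0001',
      x: e.clientX - rect.left,
      y: e.clientY - rect.top,
      plane: plane,
    }),
  });
});

// Show the first plane as soon as it is available.
stack[0].addEventListener('load', draw);
if (stack[0].complete) draw();

[The page needs only a canvas element, e.g. <canvas id="vm-canvas" width="512" height="512"></canvas>, plus image tiles served at the assumed paths.]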
How many samples do you anticipate being found during the course of the project?
Great question. The short answer is that we don’t know. The long answer is a bit more complicated. Here’s what we know. The Galileo and Ulysses spacecraft carried dust detectors onboard that Eberhard Gruen and his colleagues used to first detect and then measure the flux of interstellar dust particles streaming into the solar system. (This is a kind of “wind” of interstellar dust, caused by the fact that our solar system is moving with respect to the local interstellar medium.) Markus Landgraf has estimated the number of interstellar dust particles that should have been captured by Stardust during two periods of the “cruise” phase of the interplanetary orbit in which the spacecraft was moving with this wind. He estimated that there should be around 45 particles, but this number is very uncertain — I wouldn’t be surprised if it is quite different from that. That was the long answer! One thing that I should say…is that like all research, the outcome of what we are doing is highly uncertain. There is a wonderful quote attributed to Einstein — “If we knew what we were doing, it wouldn’t be called ‘research’, would it?”
How big would the samples be?
We expect that the particles will be of order a micron in size. (A millionth of a meter.) When people are searching using the virtual microscope, they will be looking not for the particles, but for the tracks that the particles make, which are much larger — several microns in diameter. Just yesterday we switched over to a new site which has a demo of the VM (virtual microscope); I invite you to check it out. The tracks in the demo are from submicron carbonyl iron particles that were shot into aerogel using a particle accelerator modified to accelerate dust particles to very high speeds, to simulate the interstellar dust impacts that we’re looking for.
And that’s on the main Stardust@home website [see below]?
Yes.
How long will the project take to complete?
Partly the answer depends on what you mean by “the project”. The search will take several months. The bottleneck, we expect (but don’t really know yet), is in the scanning — we can only scan about one tile per day, and there are 130 tiles in the collector… These particles will be quite diverse, so we’re hoping that we’ll continue to have lots of volunteers collaborating with us on this after the initial discoveries. It may be that the 50th particle that we find will be the real Rosetta stone that turns out to be critical to our understanding of interstellar dust. So we really want to find them all! Enlarging the idea of the project a little, beyond the search, though, is to actually analyze these particles. That’s the whole point, obviously!
And this is the huge advantage with this kind of a mission — a “sample return” mission.
Most missions do things quite differently… you have to build an instrument to make a measurement, and that instrument design gets locked in several years before launch, practically guaranteeing that it will be obsolete by the time you launch. Here exactly the opposite is true. Several of the instruments that are now being used to analyze the cometary dust did not exist when the mission was launched. Further, some instruments (e.g., synchrotrons) are the size of shopping malls — you don’t have a hope of flying these in space. So we can and will study these samples for many years. AND we have to preserve some of these dust particles for our grandchildren to analyze with their hyper-quark-gluon plasma microscopes (or whatever)!
When do you anticipate the project to start?
We’re really frustrated with the delays that we’ve been having. Some of it has to do with learning how to deal with the aerogel collectors, which are rougher and more fractured than we expected. The good news is that they are pretty clean — there is very little of the dust that you see on our training images — these were deliberately left out in the lab to collect dust so that we could give people experience with the worst case we could think of. In learning how to do the scanning of the actual flight aerogel, we uncovered a couple of bugs in our scanning software — which forced us to go back and rescan. Part of the other reason for the delay was that we had to learn how to handle the collector — it would cost $200M to replace it if something happened to it, so we had to develop procedures to deal with it, and add several new safety features to the Cosmic Dust Lab. This all took time. Finally, we’re distracted because we also have many responsibilities for the cometary analysis, which has a deadline of August 15 for finishing analysis. The IS project has no such deadline, so at times we had to delay the IS (interstellar, sorry) in order to focus on the cometary work. We are very grateful to everyone for their patience on this — I mean that very sincerely.
And rest assured that we’re just as frustrated!
I know there will be a “test” that participants will have to take before they can examine the “real thing”. What will that test consist of?
The test will look very similar to the training images that you can look at now. But… there will of course be no annotation to tell you where the tracks are!
Why did NASA decide to take the route of distributed computing? Will they do this again?
I wouldn’t say that NASA decided to do this — the idea for Stardust@home originated here at U. C. Berkeley. Part of the idea of course came…
If I understand correctly it isn’t distributed computing, but distributed eyeballing?
…from the SETI@home people who are just down the hall from us. But as Brian just pointed out, this is not really distributed computing like SETI@home; the computers are just platforms for the VM, and it is human eyes and brains that are doing the real work, which makes it fun (IMHO).
That said… there have been quite a few people who have expressed interest in developing automated algorithms for searching. Just because WE don’t know how to write such an algorithm doesn’t mean nobody does. We’re delighted at this and are happy to help make it happen.
Isn’t there a catch-22, in that the data you’re going to collect would be a prerequisite to automating the process?
That was the conclusion that we came to early on — that we would need some sort of training set to be able to train an algorithm. Of course you have to train people too, but we’re hoping (we’ll see!) that people are more flexible in recognizing things that they’ve never seen before and pointing them out. Our experience is that people who have never seen a track in aerogel can learn to recognize them very quickly, even against a big background of cracks, dust and other sources of confusion… Coming back to the original question — although NASA didn’t originate the idea, they are very generously supporting this project. It wouldn’t have happened without NASA’s financial support (and of course access to the Stardust collector). Did that answer the question?
Will a project like this be done again?
I don’t know… There are only a few projects for which this approach makes sense… In fact, I frankly haven’t run across another at least in Space Science. But I am totally open to the idea of it. I am not in favor of just doing it as “make-work” — that is just artificially taking this approach when another approach would make more sense.
How did the idea come up to do this kind of project?
Really desperation. When we first thought about this, we assumed that we would use some sort of automated image recognition technique. We asked some experts around here in CS, and the conclusion was that the problem was somewhere between trivial and impossible, and we wouldn’t know until we had some real examples to work with. So we talked with Dan Werthimer and Dave Anderson (literally down the hall from us) about the idea of a distributed project, and they were quite encouraging. Dave proposed the VM machinery, and Josh Von Korff, a physics grad student, implemented it. (Beautifully, I think. I take no credit!)
I got to meet one of the Stardust directors in March during the Texas Aerospace Scholars program at JSC. She talked about searching for meteorites in Antarctica, ones that were unblemished by Earth conditions. Is that our best chance of finding new information on comets and asteroids? Or will more Stardust programs be our best solution?
That’s a really good question. Much will depend on what we learn during this official “Preliminary Examination” period for the cometary analysis. Aerogel capture is pretty darn good, but it’s not perfect and things are altered during capture in ways that we’re still understanding. I think that much also depends on what question you’re asking. For example, some of the most important science is done by measuring the relative abundances of isotopes in samples, and these are not affected (at least not much) by capture into aerogel.
Also, she talked about how some of the agencies that they gave samples to had lost or destroyed 2-3 samples while trying to analyze them. One, in fact, had become statically charged and stuck to the side of the microscope lens, and they spent over an hour looking for it. Is that really our biggest danger? Giving out samples as a show of good faith, and not letting NASA examine all samples collected?
These will be the first measurements, probably, that we’ll make on the interstellar dust. There is always a risk of loss. Fortunately for the cometary samples there is quite a lot there, so it’s not a disaster. NASA has some analytical capabilities, particularly at JSC, but the vast majority of the analytical capability in the community is not at NASA but at universities, government labs and other institutions all over the world. I should also point out that practically every analytical technique is destructive at some level. (There are a few exceptions, but not many.) The problem with meteorites is that, except in a very few cases, we don’t know where they specifically came from. So having a sample that we know for sure is from the comet is golden!
I am currently working on my Bachelor’s in computer science, with a minor in astronomy. Do you see successes of programs like Stardust opening up more private space exploration positions for people such as myself, even though I’m not in the typical “space” fields of education?
Can you elaborate on your question a little — I’m not sure that I understand…
Well, while at JSC I learned that they mostly want engineers, and a few science grads, and I worry that my computer science degree will not be very valuable, as the NASA rep told me only 1% of the applicants for their work study program are CS majors. I’m just curious as to your thoughts on whether CS majors will be more in demand now that projects like Stardust and the Mars missions have been great successes. Have you seen a trend towards more private businesses moving in that direction, especially with President Bush’s statement about man on the Moon in 2015?
That’s a good question. I am personally not very optimistic about the direction that NASA is going. Despite recent successes, including but not limited to Stardust, science at NASA is being decimated.
I made a joke with some people at the TAS event that one day SpaceShipOne will be sent up to save stranded ISS astronauts. It makes me wonder what kind of private redundancy the US government is taking for future missions.
I guess one thing to be a little cautious about is that despite SpaceShipOne’s success, we haven’t had an orbital project that has been successful in that style of private enterprise. It would be nice to see that happen. I know that there’s a lot of interest…!
Now I know the answer to this question… but a lot do not… When samples are found, how will they be analyzed? Who gets the credit for finding the samples?
The first person who identifies an interstellar dust particle will be acknowledged on the website (and probably will be much in demand for interviews from the media!), will have the privilege of naming the particle, and will be a co-author on any papers that WE (at UCB) publish on the analysis of the particle. Also, although we are precluded from paying for travel expenses, we will invite those who discover particles AND the top performers to our lab for a hands-on tour.
We have some fun things, including micromachines.
How many people/participants do you expect to have?
About 113,000 have preregistered on our website. Frankly, I don’t have a clue how many will actually volunteer and do a substantial amount of searching. We’ve never done this before, after all!
One last thing I want to say … well, two. First, we are going to special efforts not to do any searching ourselves before we go “live”. It would not be fair to all the volunteers for us to get a jumpstart on the search. All we are doing is looking at a few random views to make sure that the focus and illumination are good. (And we haven’t seen anything — no surprise at all!) Also, the attitude for this should be “Have Fun”. If you’re not having fun doing it, stop and do something else! A good maxim for life in general!
The Buffalo News is reporting that leaders of unions representing city employees in Buffalo, New York, met on Wednesday to discuss a possible ‘citywide strike.’ If the unions decided on a strike, firefighters, police officers, teachers and many other city employees would not show up for work.
The News is reporting that employees are upset about the wage-freeze which was put into effect when the Control Board took control of the city’s finances 27 months ago. Some employees even call the wage freeze “working class genocide.”
“Genocide of the working class is also illegal,” president of the Police Benevolent Association, Robert P. Meegan Jr., said to reporters when told that a strike would be illegal, after the meeting.
“We don’t expect to see any job actions with the bargaining units directly under the city administration. The fact that their legal counsel wouldn’t let [unions] comment any further speaks for itself,” said mayor of Buffalo, Byron W. Brown, in reference to New York state’s Taylor Law, which makes striking illegal.
Members of the Control Board have refused to comment to the press about the option to strike.
The Police Benevolent Association has filed a lawsuit which would allow workers to strike if a judge grants permission to do so.
Pakistani officials said that a suspected US drone attack has killed at least eight people today, including foreign militants, in the country’s northwest.
The officials said that two missiles were fired at a compound being used by suspected Taliban militants in Mir Ali village, in North Waziristan, a region on the Afghan border that is said to be a militant stronghold.
“At least eight people were killed in the drone attack. A compound used by militants was targeted,” said an unnamed official to the Agence France-Presse (AFP) news agency.
Police official Mohammad Haroon said to the Reuters news agency that “it was a remote-controlled bomb. Two policemen died on the spot, while a third has succumbed to his injuries a short while ago.”
US military officials generally do not confirm attacks such as these, but the US armed forces and Central Intelligence Agency personnel in Afghanistan are the only forces that deploy pilotless drones in the region.
At least 65 similar drone strikes have killed about 625 people in Pakistan since August of last year, according to AFP.
The USAFL Nationals will feature teams from the United States and Canada. A 50/50 rule is being implemented for the tournament: an American team can have no more than nine non-American players, and a Canadian team no more than nine non-Canadians.
Australian rules football is played on a field 170 metres by 160 metres, with two teams of 18 players a side. Scores are quoted as goals-behinds (total); a goal is worth six points and a behind one point.
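The bracketed total in that notation can be derived with a one-line formula; a minimal JavaScript illustration (the score line used here is invented for the example):

// Total points from an Australian rules score line of goals.behinds.
function totalPoints(goals, behinds) {
  return goals * 6 + behinds;  // six points per goal, one per behind
}
console.log(totalPoints(10, 12));  // a score line of 10.12 totals 72 points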
According to USFooty, the tournament will attract over 1000 players. The tournament will have four divisions for men and one for women.