Thomas Hawk posted a photo:
The Montecito was built in 1935 with 95 units at a cost of $1 million. Set on a hill overlooking the city, the Montecito is the highest building in Hollywood. It has a private swimming pool, two subterranean garages and a parking lot. The building is a classic Art Deco design with Mayan influences and windows arranged in vertical bands. In 1946, it was sold for $600,000. In 1954, it was sold again, this time by Isadore and Libby Teacher to Howard Fox and Harry Wyatt.
The Montecito was home to several future movie stars, especially New York-based actors working in Hollywood. It was Ronald Reagan’s first residence when he moved to Hollywood; Reagan lived at the Montecito from June 1937 to late 1938. Reagan was said to have been roommates at the Montecito with Mickey Rooney. Other celebrities who have lived at the Montecito include James Cagney, George C. Scott, Montgomery Clift, Geraldine Page, Don Johnson, Sal Mineo and Ben Vereen.
DEN HAAG (ANP) - A Dutch national was injured in the attack at Bondi Beach in Australia on Sunday, a spokesperson for the Ministry of Foreign Affairs reports. The person holds dual nationality; apart from Dutch, it is not known which other nationality the person holds.
The Ministry of Foreign Affairs says it is in contact with the victim. The person's injuries are not life-threatening; the ministry says it will provide assistance if requested. For now, however, there is no request for assistance.

Look. You don't have to explain to us (or to yourself) how the internet works. When you post something, reactions follow, and quite often those are nasty reactions, and that is not (we repeat: not) the fault of the person who posted the original message, but of the person who posts the reaction.
BUUUUT... Then again, we're not the ones who left X with great fanfare because of the hateful replies there (complete with an appearance by the editor-in-chief on Eva about how terrible a little tweet by Mona Keijzer was) while simply staying on TikTok, like the NOS did. And ONE DAY after the news of the attack at Bondi Beach was posted there, the comments are packed. As Nasje[watermelon emoji] aptly sums it up: "The comment section did not dissapoint" (188 likes). A small anthology after the break.

Many women feel unsafe while running. That leads to 'self-disciplining'. Without support, that will not change.
This may be the last New Year's Eve on which consumers are allowed to set off fireworks. The ban will take effect once three conditions have also been met, including an exemption for fireworks clubs.
Jaguar Land Rover (JLR) has reportedly told staff the cyber raid that crippled its operations in August didn't just bring production to a screeching halt – it also walked off with the personal payroll data of thousands of employees.…
Cast your mind back to May of this year: Congress was in the throes of debate over the massive budget bill. Amidst the many seismic provisions, Senator Ted Cruz dropped a ticking time bomb of tech policy: a ten-year moratorium on the ability of states to regulate artificial intelligence. To many, this was catastrophic. The few massive AI companies seem to be swallowing our economy whole: their energy demands are overriding household needs, their data demands are overriding creators’ copyright, and their products are triggering mass unemployment as well as new types of clinical psychoses. In a moment where Congress is seemingly unable to act to pass any meaningful consumer protections or market regulations, why would we hamstring the one entity evidently capable of doing so—the states? States that have already enacted consumer protections and other AI regulations, like California, and those actively debating them, like Massachusetts, were alarmed. Seventeen Republican governors wrote a letter decrying the idea, and it was ultimately killed in a rare vote of bipartisan near-unanimity.
The idea is back. Before Thanksgiving, a House Republican leader suggested they might slip it into the annual defense authorization bill. Then, a draft document leaked outlining the Trump administration’s intent to enforce the state regulatory ban through executive powers. An outpouring of opposition (including from some Republican state leaders) beat back that notion for a few weeks, but on Monday, Trump posted on social media that the promised Executive Order is indeed coming soon. That would put the AI regulations of a growing cohort of states, including California and New York, as well as Republican strongholds like Utah and Texas, in jeopardy.
The constellation of motivations behind this proposal is clear: conservative ideology, cash, and China.
The intellectual argument in favor of the moratorium is that “freedom”-killing state regulation on AI would create a patchwork that would be difficult for AI companies to comply with, which would slow the pace of innovation needed to win an AI arms race with China. AI companies and their investors have been aggressively peddling this narrative for years now, and are increasingly backing it with exorbitant lobbying dollars. It’s a handy argument, useful not only to kill regulatory constraints, but also—companies hope—to win federal bailouts and energy subsidies.
Citizens should parse that argument from their own point of view, not Big Tech’s. Preventing states from regulating AI means that those companies get to tell Washington what they want, but your state representatives are powerless to represent your own interests. Which freedom is more important to you: the freedom for a few near-monopolies to profit from AI, or the freedom for you and your neighbors to demand protections from its abuses?
There is an element of this that is more partisan than ideological. Vice President J.D. Vance argued that federal preemption is needed to prevent “progressive” states from controlling AI’s future. This is an indicator of creeping polarization, where Democrats decry the monopolism, bias, and harms attendant to corporate AI and Republicans reflexively take the opposite side. It doesn’t help that some in the parties also have direct financial interests in the AI supply chain.
But this does not need to be a partisan wedge issue: both Democrats and Republicans have strong reasons to support state-level AI legislation. Everyone shares an interest in protecting consumers from harm created by Big Tech companies. In leading the charge to kill Cruz’s initial AI moratorium proposal, Republican Senator Marsha Blackburn explained that “This provision could allow Big Tech to continue to exploit kids, creators, and conservatives … we can’t block states from making laws that protect their citizens.” More recently, Florida Governor Ron DeSantis has said he wants to regulate AI in his state.
The often-heard complaint that it is hard to comply with a patchwork of state regulations rings hollow. Pretty much every other consumer-facing industry has managed to deal with local regulation—automobiles, children’s toys, food, and drugs—and those regulations have been effective consumer protections. The AI industry includes some of the most valuable companies globally and has demonstrated the ability to comply with differing regulations around the world, including the EU’s AI and data privacy regulations, which are substantially more onerous than those so far adopted by US states. If we can’t leverage state regulatory power to shape the AI industry, to what industry could it possibly apply?
The regulatory superpower that states have here is not size and force, but rather speed and locality. We need the “laboratories of democracy” to experiment with different types of regulation that fit the specific needs and interests of their constituents and evolve responsively to the concerns they raise, especially in an area as consequential and rapidly changing as AI.
We should embrace the ability of regulation to be a driver—not a limiter—of innovation. Regulations don’t restrict companies from building better products or making more profit; they help channel that innovation in specific ways that protect the public interest. Drug safety regulations don’t prevent pharma companies from inventing drugs; they force them to invent drugs that are safe and efficacious. States can direct private innovation to serve the public.
But, most importantly, regulations are needed to prevent the most dangerous impact of AI today: the concentration of power associated with trillion-dollar AI companies and the power-amplifying technologies they are producing. We outline the specific ways that the use of AI in governance can disrupt existing balances of power, and how to steer those applications towards more equitable balances, in our new book, Rewiring Democracy. In the nearly complete absence of Congressional action over the years in which AI has swept the world’s attention, it has become clear that states are the only effective policy levers we have against that concentration of power.
Instead of impeding states from regulating AI, the federal government should support them to drive AI innovation. If proponents of a moratorium worry that the private sector won’t deliver what they think is needed to compete in the new global economy, then we should engage government to help generate AI innovations that serve the public and solve the problems most important to people. Following the lead of countries like Switzerland, France, and Singapore, the US could invest in developing and deploying AI models designed as public goods: transparent, open, and useful for tasks in public administration and governance.
Maybe you don’t trust the federal government to build or operate an AI tool that acts in the public interest? We don’t either. States are a much better place for this innovation to happen because they are closer to the people, they are charged with delivering most government services, they are better aligned with local political sentiments, and they enjoy greater trust. They’re where we can test, iterate, compare, and contrast regulatory approaches that could inform eventual and better federal policy. And, while the costs of training and operating performant AI tools like large language models have declined precipitously, the federal government can play a valuable role here in funding cash-strapped states to lead this kind of innovation.
This essay was written with Nathan E. Sanders, and originally appeared in Gizmodo.
EDITED TO ADD: Trump signed an executive order banning state-level AI regulations hours after this was published. This is not going to be the last word on the subject.