
Search Results

17 items found

  • Resurrecting the Past: How Dead Celebrities Would Look in 2023

    The advancement of artificial intelligence has been truly remarkable, and one of the areas where it has been most transformative is image generation. One of the leading AI image generators is MidJourney, currently in version 5, which uses deep learning algorithms to create highly realistic images from scratch. With the help of MidJourney or similar technologies, the only limit left is the imagination of the prompt engineer - nothing seems impossible anymore. One fascinating use case gives me goosebumps: photographs of deceased celebrities as they might look today if they were still alive. Many celebrities have died far too young, leaving fans to wonder what they might look like if they had lived to old age. This is what I came up with by feeding MidJourney the following prompt: "Artistic portrait of [name] at the age of [age]. Photorealistic, photographed on a Fuji vintage camera with open aperture. Black background. [Name] looking into the camera." (The bracketed placeholders stand in for each celebrity's name and hypothetical age.) Hover over the images below to show the celebrity's name and the age they would be today.

  • Musk allegedly working on the Truth

    Ironically, Elon Musk, a pathological liar, claims in a recent Fox News interview with Tucker Carlson, also a pathological liar, to be working on an AI called "TruthGPT", which he describes as "maximum truth-seeking". The irony of the story: Just three weeks ago, Musk joined an initiative that had publicly called for the development of AI systems more powerful than OpenAI's GPT-4 to be paused for at least half a year. Maybe it is more likely that Musk realized that the AI train was already moving at full speed while he was still standing at the station looking at his watch? After all, years ago he himself invested in OpenAI, the company behind the mega bot "ChatGPT". Must be a pain to see them skyrocket through the roof while his own rocket launch was scrubbed yesterday. In the interview, Musk describes other forms of AI as the greatest threat to humanity, one that could potentially wipe out our civilization, while selling "TruthGPT" as god's gift. "An AI that cares about understanding the universe is unlikely to annihilate humans because we are an interesting part of the universe." - Elon Musk Musk compared the hope that an all-powerful AI would choose to spare humanity to the way humans deal with chimpanzees: "We recognize humanity could decide to hunt down all the chimpanzees and kill them," Musk said. "We're actually glad that they exist, and we aspire to protect their habitats." - quite ironic, by the way, considering how Musk's company Neuralink treats its animals during tests. The naming is also pretty remarkable: "TruthGPT" is very reminiscent of "Truth Social", Trump's mendacious social network, which Musk himself called a "rightwing echo chamber" just a few weeks ago - but probably only out of offended pride, because Trump has preferred his own platform over a return to Twitter. There is uncertainty regarding the current status of Musk's "TruthGPT" and whether it exists at all. 
Musk has a track record of making unsupported claims, so launching "TruthGPT" with a falsehood would be ironic but not surprising.

  • The Love Affair with Vintage Computers

    I am writing this on a 33-year-old Macintosh and I love it. But where are these deep feelings for yesterday's tech coming from? The Macintosh SE/30 was a personal computer manufactured by Apple Inc. and released in January 1989. It was a compact and powerful machine designed as a follow-up to the Macintosh SE, and it quickly became a popular choice among professionals and enthusiasts alike. One of its most notable features was its compact design: it was housed in a case similar in size to the Macintosh SE's, but far more powerful. It featured a 16 MHz Motorola 68030 processor, a significant improvement over the 8 MHz processor found in the Macintosh SE. This allowed the SE/30 to run much faster and more smoothly than its predecessor, making it a great choice for professionals who needed a powerful computer for tasks such as video editing and graphic design. This Macintosh also featured a PDS (Processor Direct Slot), which allowed users to add additional hardware such as a math coprocessor, a SCSI card, or a network card (I have added a SCSI2SD card reader to this slot, operating a 64GB SD card as an internal hard drive). "The Macintosh SE/30 embodies the same "user-friendly" philosophy as the original Macintosh, while providing more memory, faster performance, and greater expandability. The Macintosh is still easy to learn, and now it's more powerful than ever." (Macintosh Owner's Guide, 1989) Now - pretty much 33 years after its release on January 19, 1989 - I'm sitting in front of this beautiful chunk of plastic (while other vintage Macs tend to turn yellow, mine remains an almost pristine gray-white). The SE and SE/30 were designed by Frog Design, a design agency used by Apple from 1984 to 1990. Hartmut Esslinger created the Snow White design language, used across Apple's product range during this period. 
Apple spent the early 1990s moving on from the Snow White design language. Flipping the power switch answers with a mechanical "clunk" - a sound people born in the 90s or later can hardly imagine. You can feel the power running through the Macintosh before its display flickers to life. Everything about the Macintosh feels clunky and loud. It is a dinosaur, a behemoth from "back then". And everything about it has a charm no modern tech will ever have. People tend to think that our modern tech will acquire the same aura 30+ years from now - believe me, it won't. I am actually typing these lines on the original Macintosh keyboard - each keystroke takes some effort and it feels really exhausting in the beginning, like writing on a typewriter. But the sound and mechanical feedback are so rewarding - I am instantly falling in love with the Macintosh (again). The Macintosh SE/30 is not only a 30-plus-year-old home computer, it is a remnant of a whole era I am thankful to have lived in. Everything took long back then. Booting up the operating system (especially from a disk) took minutes! People would think twice before shutting it down because booting it up again would cost time. But why are so many people nowadays seeking refuge in memories from the very first days of home computing? Why are people like me investing money and time in a plastic cube that lacks every technical standard modern computers take for granted? The SE/30 I am typing this on is not even connected to the internet, which means I am using an external USB floppy drive to exchange files. And because the drive is not supported on my M1 Mac, I am using an old 2002 iBook to relay files between my very first and latest Mac. One of the reasons why people may be drawn to vintage computers is the nostalgia they evoke. These machines can remind us of our own childhood and the early days of home computing. 
For me it was using a Macintosh SE/30 as an 11-year-old and experiencing the novelty of a mouse for the first time. It is a reminder of a time when technology was less commonplace and new experiences were exciting. The simplicity of using a mouse to move a cursor on a screen, which is now second nature, was once an amazing discovery - one my dad had to introduce me to. Perhaps these computers are also a reminder of how ephemeral we are. After all, it's not uncommon for people to cling more and more to the things that surrounded them during their childhood and adolescence. Years ago I would have run out of the room screaming at the thought of watching old commercials from decades past. Today these old videos elicit a pleasant sigh from me. I believe that the attraction to vintage computers and software is rooted in their simplicity and accessibility. It was relatively easy to learn how to program on a Sinclair Spectrum, as the knowledge required was manageable. This simplicity also extended to the industrial and commercial sectors, where software and peripheral devices may have been more complex but still functioned within a well-documented and stable environment. In the pre-internet era, comprehensive manuals were provided with minicomputers, and if additional information was needed, one could contact the designers directly. The resurgence of interest in nostalgic technology, such as vinyl records and instant cameras, has demonstrated a strong desire for vintage technology. Personal computers are no exception. As personal computing reaches its middle age, some people are drawn to revisiting the early days of the technology by restoring and using machines like the iconic Commodore PET from the 1970s. A fascination with early computing equipment and software, however, is hardly new - but retro-computing seems to be undergoing a real renaissance these days. 
There are various resources available, such as publications, online marketplaces, and physical stores catering to the demand. Many enthusiasts are restoring and repurposing old devices, as well as emulating them or integrating them with newer technology - for example, using a Raspberry Pi to enhance a Commodore VIC-20. Additionally, there are communities dedicated to playing vintage video games and using "wayback" word processing software. And you? Also infected with nostalgia? A good point to deep-dive into this would be Retro Battlestations on Reddit and Tinker Different, a vintage community focusing on Apple computers.

  • Get Your Geek On: Code is the New Black

    Attention all geeks and nerds! It's time to shed that stigma and embrace your inner programmer, because being a programmer is cool! Back in the day, programmers were seen as nothing more than a bunch of socially awkward techies who spent all their time huddled in front of a computer screen. But these days, programming is the new black. Sure, you might still get the occasional eye-roll from your non-techy friends when you start talking about your latest coding project, but they'll be eating their words when they see the big bucks you're raking in. Long gone are the days of the starving programmer, subsisting on a diet of ramen noodles and Red Bull. These days, programmers are the rock stars of the tech world. Now that I am in my mid-40s and have miserably failed at becoming a legendary streamer, I have decided that I want to learn how to code. Don't get me wrong, I love my job and am not chasing a career in software engineering, but I wanted to learn a new skill, and the thought of me programming kind of appeals to my inner geek. I also know some BASIC from my days with a Commodore C64 and some HTML (although that's not considered coding). First things first, you'll need to choose a programming language to learn. So I've asked the good folks over at Mastodon. But don't be fooled by the cute, friendly names like Python or Ruby. These languages are devious, and they will do their best to trip you up at every turn. As you dive into the world of coding, you'll quickly realize that debugging is a full-time job. You'll spend hours staring at a screen, trying to figure out why your code won't run. And just when you think you've finally figured it out, you'll discover that you left off a crucial semicolon and have to start all over again. But don't give up! The joy of finally getting your code to work is worth all the frustration. Just try not to let your celebratory dance moves start an office-wide conga line. 
    I've decided to start with Python and am currently taking courses at Codecademy. I also downloaded Mimo - a great app that delivers mini-classes in snackable portions. By now I am capable of creating breathtaking pieces of code like:

    stock = 600
    jeans_sold = 500
    target = 500
    target_hit = jeans_sold == target
    print("Hit jeans sale target:")
    print(target_hit)
    current_stock = stock - jeans_sold
    in_stock = current_stock != 0
    print("Jeans in stock:")
    print(in_stock)

    Python is a high-level, interpreted programming language. It was first released in 1991 and has since become a popular language for web development, data analysis, and scientific computing. One of its main advantages is its simplicity and readability. It has a relatively small set of keywords and a simple, clean syntax, which makes it easy to learn and use. Python also has a large and active community of users, which means there are many resources available for learning and troubleshooting. Python is used for a wide variety of applications, including web development, data analysis, machine learning, and scientific computing. Instagram, Dropbox and Netflix are among the products written in Python. So if you're ready for a challenge and have a healthy supply of caffeine and patience, go ahead and give coding a try. Just remember, when it comes to learning how to code, it's not about the destination, it's about the journey (and how many times you want to throw your computer out the window along the way). So go ahead and embrace your inner geek. print("End of Story")
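    For the curious: the same kind of stock check can be wrapped in a small reusable function. This is just an illustrative sketch of where those mini-lessons lead - the function name check_stock and its parameters are my own invention, not something from the courses:

```python
def check_stock(stock, sold, target):
    """Return (target_hit, in_stock) for a simple sales check."""
    target_hit = sold >= target    # did sales reach the target?
    remaining = stock - sold       # units left on the shelf
    in_stock = remaining > 0       # is anything left to sell?
    return target_hit, in_stock

hit, available = check_stock(stock=600, sold=500, target=500)
print("Hit jeans sale target:", hit)   # True
print("Jeans in stock:", available)    # True
```

    The nice thing about a function is that the same logic can be reused for any product without copy-pasting the comparisons.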

  • The Republican's Giant Clown Show

    The start of a new session of the U.S. House of Representatives is a long-standing tradition that takes place on January 3 following a general election. On this day, members of the House come together to begin the process of organizing and conducting the business of the chamber. And good lord, what a clown show this year's session has been. It has been fascinating to watch the mainstream media's attempts to understand the reasons behind the dispute between Representative Kevin McCarthy, the supposed leader of the House Republicans, and the approximately 20 members of his own caucus who are blocking his ascension to the position of House speaker. While the situation is certainly not amusing, it is somewhat intriguing to see how the media has tried to make sense of it. Let me break it to you: There is no sense. What we are seeing is the downside of a party composed of fascist trolls. And this is not the first time we are being allowed a glimpse behind the facade of the GOP: In October 2015, after Speaker John Boehner announced his resignation, Kevin McCarthy, then House Majority Leader, made a bid to be elected as the new Speaker of the House. His efforts were unsuccessful, as he was unable to secure the necessary support from his colleagues. This was a surprising turn of events, as McCarthy had been widely expected to win the speakership, given the Republican Party's majority in the House. However, a group of conservative Republicans, known as the Freedom Caucus, refused to support McCarthy, citing concerns about his leadership style and commitment to conservative principles. As a result, McCarthy was forced to withdraw his bid for the speakership, leaving the House without a clear leader. This led to uncertainty and infighting within the Republican Party, as members scrambled to find a new candidate who could unite the caucus and lead the party forward. 
Many criticized McCarthy for his inability to secure the support of his colleagues, arguing that it demonstrated a lack of leadership and vision. Others blamed the Freedom Caucus for being too inflexible and unwilling to work with McCarthy. Regardless of the reasons for McCarthy's failed bid, it was clear that the Republican Party was facing a leadership crisis at a time when it was already under significant strain. And it seems that Republican history is repeating itself in 2023. Until a presiding officer is selected, the House will be unable to carry out its crucial responsibilities, such as overseeing national security, examining government wrongdoing, and enacting laws. This will result in a halt in Congress's operations. And Kevin McCarthy has dramatically failed on three consecutive days and 11 votes to gather the required majority to be elected Speaker of the House - something not seen in American history for around 164 years, thanks to a group of trolls dubbed the "Taliban 20" by Team McCarthy. A significant number of the individuals in this group, whose size has ranged from 19 to 21 depending on the vote, belong to the House Freedom Caucus, a group consisting of some of the most conservative Republicans in the House of Representatives. They share geography, too: the group is dominated by southerners, mainly from Texas, Florida, or Arizona. This group is opposed to McCarthy because they believe he will hinder their ultra-right-wing agenda of battling President Joe Biden - ignoring the fact that all they currently battle is themselves and the GOP as a whole. "the best season of cspan … ever" - Jon Stewart The Republican Party's inability to choose a speaker also brings former President Donald Trump back into the public eye. After initially blaming Trump for inciting the attack on the U.S. 
Capitol on January 6th, 2021, McCarthy's relationship with Trump experienced a temporary rift. "You've got to call these people off. They're your people." McCarthy demanded of Trump during a phone call, according to Congresswoman Jaime Herrera Beutler. "Well, Kevin, I guess they are just more upset about the election theft than you are," Trump replied. However, McCarthy has since become a vocal defender of Trump, who is currently unable to wield the same level of control over the Republican Party as he once did and has linked his own political significance to McCarthy's success or failure. How is this ending? The voting has once again adjourned. A number of "Never Kevins" - notably Matt Gaetz of Florida, who today voted for Donald Trump, just a day ahead of the anniversary of the storming of the Capitol, Andy Biggs of Arizona, and Lauren Boebert of Colorado - have been clear that no amount of compromise will change their minds on opposing the California congressman. The members of the House of Representatives will continue voting until they reach a decision. The longest speaker election in the past took place in 1855-56, when Nathaniel Prentice Banks won after receiving a plurality of votes from members, rather than an absolute majority, after a whopping 133 ballots. It is possible that the current vote for speaker could go on for days or even weeks. One unlikely option to resolve the stalemate is to follow the example set back then, when lawmakers passed a resolution allowing a speaker to be chosen by a plurality vote instead of a simple majority. However, this would be a risky move for the Republican House leadership, as a divided Republican vote could lead to the leader of the Democrats, Hakeem Jeffries, being given the role of speaker. The resulting divisions within the Republican Party and the events that may follow could make their House majority unworkable. A clear signal that the self-radicalizing clown show called GOP is not capable of governing.

  • The Fascinating Science Behind Our Perception of Time

    The start of a new year is a time to reflect on the past and think about what we want to accomplish in the future. It's a time to let go of any regrets or disappointments from the previous year and move forward with a positive attitude. But in the end it is all about our understanding of time itself. The concept of time has been a fundamental aspect of human existence for as long as we have been able to measure and record it. From the earliest civilizations to the present day, we have been obsessed with keeping track of time and using it to organize our lives. But where did this concept come from, and how have we come to understand it? The earliest humans likely had a very different understanding of time than we do today. They probably had a more cyclical view of time, with the passing of the seasons and the movements of the sun and moon marking its passage. It wasn't until the invention of the first clocks, which allowed us to measure time more precisely, that we were able to break free from this cyclical view and understand time in a more linear fashion. It's not accurate to say that anyone "invented" time, as time is a fundamental aspect of the universe and exists independently of human measurement or perception. However, the concept of time and our understanding of it have evolved and developed over the course of human history. One of the earliest known methods for keeping track of time was the sundial, used by the ancient Egyptians as early as 3500 BC. The sundial worked by using the position of the sun in the sky to cast a shadow on a flat surface, with the shadow moving in a predictable pattern as the sun crossed the sky. This allowed the Egyptians to divide the day into smaller units of time, such as hours, which they used to organize their daily lives. Over time, the concept of time and our understanding of it have grown more complex. 
The development of mechanical clocks in the 14th century allowed for even more precise measurement of time, and the invention of the pendulum clock in the 17th century made it possible to create highly accurate timepieces. With the advent of electronic clocks in the 20th century, timekeeping became even more precise, and today we have a wide range of technologies that allow us to measure and record time with incredible accuracy. It's difficult to imagine what the world would look like without the concept of time, as it is such a fundamental aspect of our lives and the way we understand the world around us. Time itself is a fundamental aspect of the universe and exists independently of human measurement or perception, so even if we didn't have a way to measure or understand it, it would still be present. Without the concept of time, though, our understanding of the world and our place in it would likely be very different. We rely on the passage of time to understand the cause-and-effect relationships between events, and to predict what will happen in the future. Without a way to measure or understand time, it would be much harder to make sense of the world around us. We use time to organize our daily lives and to plan for the future, and many of the systems and structures that we rely on, such as work and school schedules, are based on the concept of time. The concept of time has both pros and cons, depending on how it is used and understood. On the one hand, it lets us coordinate, plan, and make sense of events; on the other, it can create a sense of pressure or stress, as we often feel the need to use our time efficiently and accomplish as much as possible. Time can also create a sense of impermanence and transience, as everything is subject to its passage and will eventually come to an end. That, in turn, encourages a focus on the past or the future rather than the present moment, which can make it harder to appreciate and enjoy the here and now. 
It's a common perception that time seems to fly by faster as we get older, and there are a few different explanations for this phenomenon. For one, our brain's perception of time changes as we age. As we get older, our brain's processing speed slows down, which can make time seem to pass more quickly. Additionally, the hippocampus, a part of the brain involved in memory and the perception of time, tends to shrink with age, which can also affect our sense of time. When we are young, every year represents a significant proportion of our total life experience, so it can feel like a longer period of time. As we get older, each year represents a smaller portion of our total life experience, which can make it feel like time is flying by. Returning to the initial thought about this time of the year, with 2023 just around the corner, our perception of time becomes significantly more complicated if you also take time zones into account. On December 31st at 10am GMT it will already be 2023 on Kiritimati in the Pacific, with Samoa following shortly after. Over the following hours it will sequentially become 2023 across the world's time zones. "Each zone a construct, as is the arbitrary notion that one year ends and another starts at a specific moment", Ted Hunt wrote in an essay in 2018. "Why was the time traveler always so calm? Because he knew that everything was relative!" "Let's get rid of time zones", you say? There have been various proposals throughout history to adjust or eliminate time zones, but none of them have succeeded. Time zones divide the Earth's surface into roughly equal areas, with each area using a standard time loosely aligned with the position of the sun in the sky, so a single unified world time is unfortunately not a realistic option: noon would mean midday in one place and the middle of the night in another, which is exactly the mismatch time zones were created to avoid. 
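How far ahead the earliest time zone is can be checked in code - a minimal sketch using Python's standard-library zoneinfo module (assuming the system ships the usual IANA time zone database):

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # standard library since Python 3.9

# Midnight on January 1st, 2023 on Kiritimati (UTC+14, the earliest time zone)
new_year = datetime(2023, 1, 1, 0, 0, tzinfo=ZoneInfo("Pacific/Kiritimati"))

# The very same instant, expressed in GMT/UTC
utc_equivalent = new_year.astimezone(ZoneInfo("UTC"))
print(utc_equivalent)  # 2022-12-31 10:00:00+00:00
```

In other words, while the first islands in the Pacific are toasting to 2023, it is still mid-morning of December 31st in Greenwich.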
As Oliver Burkeman puts it: "The average human lifespan is absurdly, insultingly brief. Assuming you live to be eighty, you have just over four thousand weeks." Maybe I should stop worrying about time and start living in the present. Oh, and if you are still looking for a topic-related great read, I would like to recommend "Four Thousand Weeks: Time Management for Mortals" by Oliver Burkeman. Have a wonderful start into the new year - one time zone at a time.

  • The Surprising Benefits of Putting Things Off

    As the year comes to a close, it can be tempting to frantically try to tie up loose ends, make last-minute plans, and finish everything on our to-do lists. But taking a break and doing nothing can be just as important as being productive. First and foremost, taking time to rest and relax can be beneficial for our mental health. The end of the year can be a particularly stressful time, with holiday plans, end-of-year deadlines, and the pressure to reflect on the past year and make resolutions for the new one. Taking a break from all of that can help us recharge and reset. In addition to the mental health benefits, doing nothing can also be good for our physical health. When we're constantly on the go and multitasking, it can be easy to neglect our bodies' needs for rest and recovery. Taking time to relax can help reduce stress, improve sleep, and give our bodies a chance to rest and rejuvenate. But it's not just about resting and relaxing – doing nothing can also be a time to reflect and rejuvenate. The end of the year is a natural time to look back on the past year and consider what we've accomplished, as well as what we want to work on in the future. Taking a break from our normal routines can give us the space and time we need to really think about these things and come up with plans and goals that are meaningful to us. Procrastination has a bad reputation, often being associated with laziness or a lack of discipline. However, recent research suggests that procrastination may not always be a negative thing and can even have some benefits. One potential benefit of procrastination is that it can help us prioritize. When we have a lot of tasks on our plate, it can be overwhelming to try to tackle everything at once. By waiting until the last minute to complete a task, we are essentially forcing ourselves to prioritize and focus on what is most important. This can help us be more efficient and avoid wasting time on tasks that may not be as crucial. 
Another benefit is that it can increase our creativity and problem-solving skills. When we are under pressure to complete a task, we may be more inclined to think outside the box and come up with creative solutions. This can be especially helpful when we are faced with a difficult or challenging task. Procrastination can also be a way of coping with stress or anxiety. When we are feeling overwhelmed, it can be helpful to take a break and put off tasks until we feel more capable of tackling them. This can help us avoid burnout and ensure that we are able to complete tasks to the best of our ability. Of course, it's important to find a balance when it comes to procrastination. If we constantly put off tasks until the last minute, it can lead to negative consequences such as missed deadlines, poor performance, and increased stress. However, if we are mindful of our procrastination habits and use them as a way to prioritize and increase our creativity, procrastination can be a helpful tool rather than a hindrance. In conclusion, while procrastination may have a negative reputation, it can actually work for us rather than against us if we use it mindfully - by helping us prioritize tasks, boosting our creativity, and giving us a way to cope with stress. However, it's important not to let procrastination interfere with our ability to complete tasks effectively. So, as the end of the year approaches, don't be afraid to take a break and do nothing. Whether it's taking a few hours to sit and relax, or taking a few days off to rest and recharge, doing nothing can be just as important as being productive. Trust yourself to know when you need a break, and give yourself the gift of rest and relaxation. You'll be glad you did.

  • Stuck in the echo chamber of hate

    Twitter has become a ticking time-bomb and it's getting worse every day. And yet we still have not seen the mass exodus of journalists everyone would have expected after Musk banned a dozen of them for flimsy reasons, claiming they had been a threat to his safety. Why is the media so dependent on Twitter that it's holding itself hostage on a platform that's obviously turning against them? We as journalists should have left Twitter the moment colleagues were banned for reporting critically on the lone self-proclaimed ruler. All together. We could have made an impact, as Twitter without journalists would basically fall back to posting images of cats. But we stayed, being too fascinated, too scared to miss something "worth reporting". Major news outlets lack the courage to make the first step, e.g. to Mastodon, and Elon Musk is aware of exactly that. We're selling ourselves to a narcissist who's making fun of the press and what it stands for. We really should have left. Why can't we? In May 2021, journalist Dave Lee, who covered technology for BBC News, was temporarily banned from Twitter after he tweeted about Musk's behavior on the platform. Lee argued that Musk's tweets, which included false and misleading information, were harmful to the public and called for Twitter to hold him accountable. However, Twitter argued that Lee's tweets violated the platform's rules against harassment and abuse, and banned him as a result. The ban of Dave Lee sparked a wider conversation about the role of social media platforms in regulating the behavior of high-profile users, particularly those with a large influence on public opinion and the markets. Some argued that Twitter has a responsibility to hold all users, including Musk, to the same standards and that the ban of journalists like Lee undermines the credibility of the platform. 
Others defended Twitter's right to enforce its terms of service and argued that journalists should be held to a higher standard of accuracy and professionalism on social media. Ultimately, the ban of journalists on Twitter raises important questions about the balance between free expression and accountability on social media platforms. While it is important to protect the freedom of journalists to report and share information, it is also important for social media platforms to enforce their rules and hold all users accountable for their actions. As the role of social media in public discourse continues to evolve, it will be important for these platforms to find a way to strike this balance in a fair and transparent manner. Twitter has obviously decided against this. But it is getting worse: Musk announced on Saturday that Twitter "will start incorporating mute & block signals from Blue Verified (not Legacy Blue) as downvotes". Taking into account that the majority of upcoming subscribers will be the army of goons loyal to Elon Musk, this will result in unwanted critics being down-voted, creating echo chambers of hate. Don't forget: Twitter doesn't rank among the top 5 social networks, not even the top 10! It's the 15th most popular social media platform in the world in terms of users - even though it's most likely by far the loudest. There are over 1.3 billion Twitter accounts, but only 237.8 million of them are monetizable monthly active users. What does this tell you about the quality of users? Twitter has not been about conversation and debate for a long time. These days it is all about broadcasting and showing off. Unlike Mastodon, where there is actual interaction and great dialogue. "Twitter is a war zone. Thank god it's just a digital village square and not a real one. The stakes are still high, but they're not life & death." - Elon Musk, 2018 As for myself: I have had my finger hovering above the "Deactivate your Account" button multiple times. 
But right now I feel that no longer being able to observe and comment on Elon Musk might be the bigger evil.

  • The Fiery Tactics of Elon Musk

    Over the years, Elon Musk has developed a reputation for being outspoken and controversial on Twitter. His tweets have been known to cause significant fluctuations in the stock market, and he has been criticized for using the platform to make baseless claims or attack his critics. Some have even noticed a "Trumpification" of Elon Musk, comparing his behavior to that of former President Donald Trump, who was known for his combative and divisive tweets while in office. So, what strategies might Musk be using to behave like Trump on Twitter? Here are a few possibilities:

Playing to his audience: Both Musk and Trump are known for their ability to connect with their followers on a personal level. Both have a knack for understanding what their audience wants to hear and for using social media to speak directly to them.

Using Twitter as a megaphone: Both have used Twitter to amplify their voices and get their message out to a wider audience. Each has millions of followers on the platform, which gives them a significant stage to spread their views.

Stoking controversy: Both have been known to court controversy on Twitter, often making statements designed to provoke a strong reaction. This can be an effective way to grab attention and create headlines, but it is also a risky strategy that can backfire.

Attacking critics: Both Musk and Trump have a history of using Twitter to attack their critics and those who disagree with them. This can deflect criticism and draw attention away from negative news, but it can also fuel further conflict and alienate potential allies. Both have gathered a kind of follower army they can direct toward potential targets, most recently Yoel Roth and Anthony Fauci.

Increasing daily active users: One aspect of Musk's social media presence that has garnered significant attention is his use of provocative tweets to increase daily active users.
These tweets, which often contain controversial or divisive content, have proven extremely effective at driving engagement and attracting new followers. There are a few reasons why.

First, they tend to generate a strong emotional response in his followers, whether it be excitement, anger, or something in between. This emotional response leads to higher levels of engagement, as people are more likely to share, comment on, and like a tweet that elicits a strong emotional reaction.

Second, Musk's tweets often garner media attention, which increases the visibility of his social media accounts. This visibility can attract new followers who may not have heard of Musk or his companies before but are now interested in learning more about him and his work.

Finally, Musk's provocative tweets often spark debates and discussions online, which can further increase engagement and attract new followers. These discussions range from serious exchanges about important topics to more lighthearted debates about the latest news and trends, or simply posts about popular buzz topics like "Elden Ring" becoming Game of the Year.

It is difficult to say with certainty whether Elon Musk's Twitter strategy has backfired, as the success or failure of a social media strategy depends on a variety of factors and can be subjective. Musk's Twitter strategy has certainly garnered significant attention and engagement, and has helped him attract a large following on the platform. However, his tweets have also been controversial or divisive, and some people may view his use of provocative content as problematic or unethical. In some cases, Musk's tweets have sparked backlash or negative media attention, which could be seen as a failure of his social media strategy.
For example, in 2018, Musk faced criticism and backlash after he tweeted that he was considering taking Tesla private and had secured funding to do so, which turned out to be false. This incident led to an investigation by the Securities and Exchange Commission (SEC) and resulted in Musk being fined and forced to step down as Tesla's chairman.

  • A Love Letter to the 80s

    As a geek dad born in the late 70s, I have a deep love for the 80s and all things nerdy. I grew up during the heyday of arcade gaming, spent countless hours playing classics like Pac-Man and Space Invaders, and will never forget the day I saw Star Wars, WarGames, The Goonies or Back to the Future for the very first time. This is a love letter to the best decade that ever happened to mankind.

The 80s was a time of great cultural significance, with a plethora of iconic movies, music, and video games that continue to captivate audiences to this day. From the advent of the personal computer to the rise of synth-pop, the 80s were a time of great innovation and creativity.

In the realm of cinema, the 80s saw the release of numerous now-classic films. From the epic space opera of Star Wars and the time-traveling adventures of Back to the Future, to the dystopian future of Blade Runner and the coming-of-age tale of The Breakfast Club, the 80s were a golden age of storytelling. And let's not forget the horror genre, which was dominated by the likes of A Nightmare on Elm Street, The Evil Dead, and Friday the 13th.

The 80s were also a time of great musical innovation, with the emergence of new genres like synth-pop and hip hop. Bands like Depeche Mode, The Smiths, and New Order took the world by storm with their catchy melodies and electronic beats, while artists like Run-DMC, LL Cool J, and Public Enemy helped pave the way for the rise of hip hop. The 80s also saw the rise of the music video, with iconic clips like Michael Jackson's "Thriller" and Duran Duran's "Hungry Like the Wolf" becoming instant classics.

And of course, the decade was a time of great technological advancement, with the personal computer revolutionizing the way we live and work. The first Apple Macintosh was released in 1984, and the invention of the World Wide Web in 1989 laid the groundwork for the internet as we know it today. But perhaps the most enduring legacy of the 80s is the explosion of video games.
The release of the Atari 2600 in 1977 kicked off the golden age of video gaming, and by the 80s, the industry was booming. Games like Pac-Man, Donkey Kong, and Space Invaders became cultural phenomena in the arcades, while home systems like the Commodore 64 and the Nintendo Entertainment System brought gaming into the living room.

Growing up in the 80s, I was surrounded by the magic of Lucasfilm. The first time I played Maniac Mansion on my Commodore 64, I was hooked. The clever puzzles, the quirky characters, and the hilarious dialogue – it was like nothing I had ever seen before. And when I discovered Zak McKracken and the Alien Mindbenders, I was equally enthralled. The interdimensional travel, the ancient alien artifacts – it was the stuff of pure geeky bliss.

All in all, the 80s were a time of great fascination and nostalgia for many people. From the epic sci-fi movies to the catchy synth-pop tunes, the 80s were a truly unique and unforgettable decade. And with the continued popularity of retro gaming and the resurgence of 80s-inspired music and fashion, it's clear that the fascination with the 80s will never truly fade.

As a geek dad, I am grateful to have grown up in the 80s, surrounded by the amazing creations of Lucasfilm and the talents of Spielberg and Lucas. And now, as I share these beloved adventures with my own kid, I am constantly reminded of the timeless appeal of the 80s and the power of imagination. Long live the geeky goodness of the 80s!

  • All that fuss about AI images

    Elon Musk kissing Donald Trump? Kanye West on the moon? It takes just a few seconds and a good prompt for your wildest dreams to come true. The hype around AI-generated images has been growing in recent years, and for good reason: AI can generate incredibly realistic images that are almost indistinguishable from those created by humans, if not better. I recently toyed around with some of the more popular AI services like Photoleap and Midjourney, and the results were mind-boggling.

AI can create images in a variety of ways, depending on the type of model being used and the desired outcome. One approach is a generative model, which is trained on a large dataset of images and then uses that knowledge to generate new images similar to the ones it was trained on. This can be done using a variety of techniques, such as deep learning or evolutionary algorithms. Another approach uses a neural network to process and analyze existing images and then generate new images based on the patterns it has learned. This can be used, for example, to generate images in a specific style or to modify existing images in a specific way.

There are dozens of services inviting users to test their AI during a trial. While Midjourney hosts a bot on Discord ($10 per month for approx. 200 images), Lightricks' Photoleap offers a comfortable iOS app ($99 per year without limitations). They all work similarly: You describe the image you want the AI to generate via a prompt. The more detail, the better, but even basic prompts like "Man on a horse" lead to stunning results. After a few seconds the AI usually presents up to 4 variations of your query. You can then either save the image you like best or have the AI create variations of your favorite. This way you iterate toward the best possible result.
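That prompt-and-refine loop can be sketched in a few lines of Python. Note that `generate` below is a hypothetical stand-in for whichever service you use (Midjourney, for instance, only runs through its Discord bot and offers no public API), so this only illustrates the workflow, not a real integration:

```python
import random

def generate(prompt, n=4, seed=None):
    """Hypothetical stand-in for an image-generation API call.

    A real service would return rendered images; here we return
    labelled placeholder strings so the loop is easy to follow.
    """
    rng = random.Random(seed)
    return [f"{prompt} [variation {i}/{n}, seed {rng.randint(0, 9999)}]"
            for i in range(1, n + 1)]

def refine(prompt, rounds=2, pick=0):
    """The typical workflow: generate up to 4 candidates, pick a
    favorite, then request variations of that favorite, and repeat."""
    current = prompt
    for _ in range(rounds):
        candidates = generate(current)
        current = candidates[pick]  # the image the user likes best
    return current

final = refine("Man on a horse")
```

Each pass narrows in on what you actually want, which is why a vague prompt can still end in a precise image after a few rounds of picking favorites.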
It is difficult to say who invented AI-generated images, as the technology has evolved over time and has been developed by many different people and organizations. The field of AI has its roots in the work of many researchers and scientists, including Alan Turing, who is widely considered the father of modern computing. In terms of the specific technology used to generate images, key figures include Geoffrey Hinton, Yann LeCun, and Andrew Ng, all pioneers of deep learning. Many companies and organizations have contributed as well, including Google, Facebook, and OpenAI.

One reason for the hype is the sheer potential of AI-generated images. In the past, creating high-quality images required a great deal of skill and time from artists. With AI, anyone can generate stunning images with just a few clicks. In seconds. This opens up a whole new world of possibilities for businesses and individuals alike.

Another reason is the rapid advancement of AI technology itself. As AI continues to improve, it is becoming capable of generating ever more realistic and sophisticated images, which means the potential applications are only going to expand.

One of the most exciting potential applications is graphic design. AI can create stunning graphics and visuals that would be difficult or impossible for humans to create on their own. This could be a huge time-saver for designers, allowing them to focus on the creative aspects of their work rather than spending hours on tedious tasks. Another potential application is virtual reality.
With AI, it is possible to create incredibly realistic virtual environments that are indistinguishable from the real world. This could be used to create immersive virtual experiences for gaming, education, and many other fields. As AI technology continues to advance and become more widely used, we will likely see more and more exhibitions and showcases of AI-generated art.

But there has also been criticism of AI-generated images, particularly regarding the potential to create fake or manipulated pictures. Some people are concerned that AI-generated images could be used to spread misinformation or deceive people. Some critics also argue that using AI to generate images can be a form of "digital blackface", as it allows people to create images of individuals without their consent or control over how they are represented. Additionally, AI-generated images can be used to create deepfake videos, which can spread false information or impersonate someone else. As such, it is important to be cautious when encountering AI-generated images and to verify the authenticity of any information presented along with them.

Overall, the hype around AI-generated images is well-deserved. As the technology continues to improve, we can expect even more impressive and sophisticated images to be generated by AI. This will open up a whole new world of possibilities for businesses and individuals alike, and we can't wait to see what the future holds.

“The development of full artificial intelligence could spell the end of the human race… It would take off on its own, and re-design itself at an ever increasing rate. Humans, who are limited by slow biological evolution, couldn’t compete, and would be superseded.” — Stephen Hawking

  • Greetings from Skynet

    Well, almost. ChatGPT is a language model developed by OpenAI, an artificial intelligence (AI) research laboratory founded in late 2015 by Elon Musk (yes, him again) and Sam Altman. It is designed to generate natural language text in response to user inputs. The model is based on the GPT-3 architecture, which uses a combination of deep learning and natural language processing techniques to generate text similar to human-generated language.

On the positive side, ChatGPT has shown impressive capabilities in generating text that is coherent and relatively natural-sounding. In tests, the model has been able to generate responses that are relevant and on-topic, and in some cases it has even engaged in complex conversations spanning multiple topics. This ability to generate coherent and engaging responses could be useful in a variety of applications, such as chatbots, content generation, and customer service.

However, there are also several potential drawbacks. One of the biggest concerns is the potential for the model to generate biased or offensive text. Because ChatGPT is trained on large amounts of text from the internet, it is likely to reflect the biases and prejudices present in that text. This can lead to the generation of offensive or discriminatory language, which can be harmful and damaging.

Artificial intelligence has the potential to greatly benefit mankind, with applications in fields such as healthcare, transportation, and manufacturing. However, there are also potential dangers associated with its development and use. One of the biggest concerns is the potential for AI to be used for malicious purposes. As the technology becomes more advanced, it could be used to develop weapons and other tools capable of causing harm (hence the "Skynet" reference in the post title).
For example, AI could be used to develop autonomous military drones capable of deciding who to attack without human intervention. This could lead to a proliferation of AI-powered weapons and increase the risk of violent conflict.

Another potential danger is the violation of privacy and personal rights. As AI systems become more sophisticated, they could be used to monitor and track individuals without their consent, leading to a loss of privacy and the potential for abuse of personal information. Additionally, there is the potential for AI to cause job displacement. As AI systems become capable of performing tasks previously done by humans, these systems could replace human workers, leading to widespread unemployment and economic disruption.

While AI has the potential to bring many benefits to mankind, it is important to carefully consider the dangers and take steps to mitigate them. This may include the development of ethical guidelines for the use of AI, as well as regulation to ensure that the technology is used for the benefit of society as a whole.

Another potential issue with ChatGPT in its current state is that it may not always generate text that is completely accurate or truthful. Because the model is not grounded in any external knowledge or context, it can produce text that is misleading or factually incorrect. This could be particularly problematic where the generated text is used as the basis for decision-making or other important actions.

Overall, while ChatGPT has shown impressive capabilities in generating natural language text, it is important to be aware of its drawbacks and limitations, to carefully consider the potential biases and inaccuracies of the generated text, and to use the model with appropriate caution and oversight.
This blog article was completely generated by ChatGPT in 3.2 seconds.
