[{"data":1,"prerenderedAt":3207},["ShallowReactive",2],{"navigation":3,"latest-posts":38},[4],{"title":5,"path":6,"stem":7,"children":8,"page":37},"Posts","\u002Fposts","posts",[9,13,17,21,25,29,33],{"title":10,"path":11,"stem":12},"Agents, MCP, RAG, Knowledge Graphs, all open source and local","\u002Fposts\u002Fagents_mcp_rag_local_foss","posts\u002Fagents_mcp_rag_local_foss",{"title":14,"path":15,"stem":16},"DeGoogle your phone","\u002Fposts\u002Fdegoogle_your_phone","posts\u002Fdegoogle_your_phone",{"title":18,"path":19,"stem":20},"Homelab Automation","\u002Fposts\u002Fhomelab_automation","posts\u002Fhomelab_automation",{"title":22,"path":23,"stem":24},"How I discovered Static Site Generators","\u002Fposts\u002Fhow_i_discovered_ssg","posts\u002Fhow_i_discovered_ssg",{"title":26,"path":27,"stem":28},"My Linux Journey","\u002Fposts\u002Fmy_switch_to_linux","posts\u002Fmy_switch_to_linux",{"title":30,"path":31,"stem":32},"Own your Data","\u002Fposts\u002Fown_your_data","posts\u002Fown_your_data",{"title":34,"path":35,"stem":36},"Self-host your AI assistant","\u002Fposts\u002Fself_host_your_ai_assistant","posts\u002Fself_host_your_ai_assistant",false,[39,895,2064,2388,2645,2931],{"id":40,"title":18,"body":41,"date":880,"description":881,"extension":882,"image":883,"meta":884,"navigation":423,"path":19,"readingTime":885,"seo":886,"stem":20,"tags":887,"__hash__":894},"content\u002Fposts\u002Fhomelab_automation.md",{"type":42,"value":43,"toc":866},"minimark",[44,49,86,91,101,127,131,134,153,161,165,180,337,343,514,521,525,528,532,543,603,613,639,645,649,662,668,671,700,706,715,729,733,736,745,756,759,775,778,787,794,797,801,804,811,845,849,859,862],[45,46,48],"h2",{"id":47},"from-chaos-to-order-with-renovate-gitlab-cicd","From chaos to order with Renovate & Gitlab CI\u002FCD",[50,51,52,53,57,58,61,62,69,70,75,76,80,81,85],"p",{},"I think everyone self-hosting eventually has been there. 
You’re setting up your first homelab, the excitement is high,\nand your ",[54,55,56],"code",{},"compose.yaml"," is littered with the most dangerous word in DevOps: ",[54,59,60],{},":latest",".\nIt works perfectly until it doesn't. One morning you wake up, look at your ",[63,64,68],"a",{"href":65,"rel":66},"https:\u002F\u002Fgithub.com\u002Fcontainrrr\u002Fwatchtower",[67],"nofollow","watchtower","\nservice updating everything, and your entire stack in ",[63,71,74],{"href":72,"rel":73},"https:\u002F\u002Fwww.portainer.io\u002F",[67],"Portainer"," is a smoking crater...\nI realized I had to stop being lazy and start treating my homelab like a production environment.\nHere is how I moved from ",[77,78,79],"em",{},"beginner’s luck"," to a fully automated, ",[82,83,84],"strong",{},"GitLab CI\u002FCD and Renovate"," pipeline.",[87,88,90],"h3",{"id":89},"the-root-cause-the-latest-trap","The Root Cause: The \"Latest\" Trap",[50,92,93,94,96,97,100],{},"The mistake wasn't just using ",[54,95,60],{},"; it was the ",[82,98,99],{},"lack of intent",". 
Using generic tags means you never have to pay attention.\nTo have a more controlled setup, I needed three things:",[102,103,104,115,121],"ul",{},[105,106,107,110,111,114],"li",{},[82,108,109],{},"Pinning:"," Total control over what version is running (e.g., sticking to ",[82,112,113],{},"3.6"," until I’ve personally verified the upgrade path).",[105,116,117,120],{},[82,118,119],{},"Visibility:"," To be notified when a new version exists without manually checking dozens of GitHub repositories.",[105,122,123,126],{},[82,124,125],{},"Automation:"," A way to deploy those updates that didn't involve me manually SSH-ing into a terminal for every minor patch,\nwhile keeping the manual option just in case.",[87,128,130],{"id":129},"a-solution-renovate","A Solution: Renovate",[50,132,133],{},"I was already tracking the Docker Compose files and configuration for my services in a private repository on my gitlab.com account.\nIt seemed a perfectly decent way to keep track of configuration, along with some per-service documentation in a few Markdown files.\nIn the end, it turned out to be a perfect first step toward GitOps and server admin tasks.",[50,135,136,137,140,141,144,145,148,149,152],{},"I also read about Renovate and its GitLab integration.\nBasically, it can scan my homelab repository and look for outdated Docker tags.\nInstead of just breaking things, Renovate opens a ",[82,138,139],{},"Merge Request",".\nIt’s like a librarian handing me a book and saying, ",[77,142,143],{},"\"Hey, there's a new version. I've gathered the release notes\nand changelogs for you (if you add a GitHub API token). Do you want to merge this?\"","\nThis allows for ",[82,146,147],{},"Intentional Upgrading",". 
If I see an update for a critical service, I can read or search for breaking changes\nbefore clicking ",[77,150,151],{},"Merge",".",[50,154,155],{},[156,157],"img",{"alt":158,"src":159,"title":160},"Mend Renovate CLI","\u002Fposts\u002Fhomelab_automation\u002Fmend-renovate-cli-banner.jpg","Renovate, the game changer.",[87,162,164],{"id":163},"renovate-gitlab-pipeline-in-practice","Renovate GitLab Pipeline in practice",[50,166,167,168,171,172,175,176,179],{},"Fortunately, GitLab offers a built-in ",[82,169,170],{},"CI\u002FCD pipeline"," that includes Renovate to simplify the configuration. First,\nyou need to create an access token with the read_api, read_repository and write_repository scopes for Renovate.\nYou can do this in GitLab's UI under Settings > Access Tokens. After that, add a variable to your GitLab CI\u002FCD settings\nnamed ",[54,173,174],{},"RENOVATE_TOKEN"," and set it to your token. Then, you just need to add a few lines of\nconfiguration in your ",[54,177,178],{},".gitlab\u002Frenovate.json"," file:",[181,182,187],"pre",{"className":183,"code":184,"language":185,"meta":186,"style":186},"language-json shiki shiki-themes material-theme-lighter material-theme material-theme-palenight","{\n  \"$schema\": \"https:\u002F\u002Fdocs.renovatebot.com\u002Frenovate-schema.json\",\n  \"extends\": [\"config:base\"],\n  \"enabledManagers\": [\"docker-compose\"],\n  \"assignees\": [\"YourGitlabUsername\"],\n  \"labels\": [\"dependencies\", \"renovate\"]\n}\n","json","",[54,188,189,198,226,251,274,297,331],{"__ignoreMap":186},[190,191,194],"span",{"class":192,"line":193},"line",1,[190,195,197],{"class":196},"sMK4o","{\n",[190,199,201,204,208,211,214,217,221,223],{"class":192,"line":200},2,[190,202,203],{"class":196},"  \"",[190,205,207],{"class":206},"spNyl","$schema",[190,209,210],{"class":196},"\"",[190,212,213],{"class":196},":",[190,215,216],{"class":196}," 
\"",[190,218,220],{"class":219},"sfazB","https:\u002F\u002Fdocs.renovatebot.com\u002Frenovate-schema.json",[190,222,210],{"class":196},[190,224,225],{"class":196},",\n",[190,227,229,231,234,236,238,241,243,246,248],{"class":192,"line":228},3,[190,230,203],{"class":196},[190,232,233],{"class":206},"extends",[190,235,210],{"class":196},[190,237,213],{"class":196},[190,239,240],{"class":196}," [",[190,242,210],{"class":196},[190,244,245],{"class":219},"config:base",[190,247,210],{"class":196},[190,249,250],{"class":196},"],\n",[190,252,254,256,259,261,263,265,267,270,272],{"class":192,"line":253},4,[190,255,203],{"class":196},[190,257,258],{"class":206},"enabledManagers",[190,260,210],{"class":196},[190,262,213],{"class":196},[190,264,240],{"class":196},[190,266,210],{"class":196},[190,268,269],{"class":219},"docker-compose",[190,271,210],{"class":196},[190,273,250],{"class":196},[190,275,277,279,282,284,286,288,290,293,295],{"class":192,"line":276},5,[190,278,203],{"class":196},[190,280,281],{"class":206},"assignees",[190,283,210],{"class":196},[190,285,213],{"class":196},[190,287,240],{"class":196},[190,289,210],{"class":196},[190,291,292],{"class":219},"YourGitlabUsername",[190,294,210],{"class":196},[190,296,250],{"class":196},[190,298,300,302,305,307,309,311,313,316,318,321,323,326,328],{"class":192,"line":299},6,[190,301,203],{"class":196},[190,303,304],{"class":206},"labels",[190,306,210],{"class":196},[190,308,213],{"class":196},[190,310,240],{"class":196},[190,312,210],{"class":196},[190,314,315],{"class":219},"dependencies",[190,317,210],{"class":196},[190,319,320],{"class":196},",",[190,322,216],{"class":196},[190,324,325],{"class":219},"renovate",[190,327,210],{"class":196},[190,329,330],{"class":196},"]\n",[190,332,334],{"class":192,"line":333},7,[190,335,336],{"class":196},"}\n",[50,338,339,340,179],{},"and add jobs to your ",[54,341,342],{},".gitlab-ci.yml",[181,344,348],{"className":345,"code":346,"language":347,"meta":186,"style":186},"language-yaml 
shiki shiki-themes material-theme-lighter material-theme material-theme-palenight","include:\n  - project: 'renovate-bot\u002Frenovate-runner'\n    file: '\u002Ftemplates\u002Frenovate-config-validator.gitlab-ci.yml'\n  - project: 'renovate-bot\u002Frenovate-runner'\n    file: '\u002Ftemplates\u002Frenovate.gitlab-ci.yml'\n\nrenovate:\n  stage: renovate\n  variables:\n    RENOVATE_EXTRA_FLAGS: --autodiscover=false\n    RENOVATE_ALLOW_POST_UPGRADE_COMMANDS: \"true\"\n    RENOVATE_REPOSITORIES: $CI_PROJECT_PATH\n  rules:\n    - if: '$CI_PIPELINE_SOURCE == \"schedule\"'\n","yaml",[54,349,350,359,378,392,406,419,425,431,442,450,461,477,488,496],{"__ignoreMap":186},[190,351,352,356],{"class":192,"line":193},[190,353,355],{"class":354},"swJcz","include",[190,357,358],{"class":196},":\n",[190,360,361,364,367,369,372,375],{"class":192,"line":200},[190,362,363],{"class":196},"  -",[190,365,366],{"class":354}," project",[190,368,213],{"class":196},[190,370,371],{"class":196}," '",[190,373,374],{"class":219},"renovate-bot\u002Frenovate-runner",[190,376,377],{"class":196},"'\n",[190,379,380,383,385,387,390],{"class":192,"line":228},[190,381,382],{"class":354},"    
file",[190,384,213],{"class":196},[190,386,371],{"class":196},[190,388,389],{"class":219},"\u002Ftemplates\u002Frenovate-config-validator.gitlab-ci.yml",[190,391,377],{"class":196},[190,393,394,396,398,400,402,404],{"class":192,"line":253},[190,395,363],{"class":196},[190,397,366],{"class":354},[190,399,213],{"class":196},[190,401,371],{"class":196},[190,403,374],{"class":219},[190,405,377],{"class":196},[190,407,408,410,412,414,417],{"class":192,"line":276},[190,409,382],{"class":354},[190,411,213],{"class":196},[190,413,371],{"class":196},[190,415,416],{"class":219},"\u002Ftemplates\u002Frenovate.gitlab-ci.yml",[190,418,377],{"class":196},[190,420,421],{"class":192,"line":299},[190,422,424],{"emptyLinePlaceholder":423},true,"\n",[190,426,427,429],{"class":192,"line":333},[190,428,325],{"class":354},[190,430,358],{"class":196},[190,432,434,437,439],{"class":192,"line":433},8,[190,435,436],{"class":354},"  stage",[190,438,213],{"class":196},[190,440,441],{"class":219}," renovate\n",[190,443,445,448],{"class":192,"line":444},9,[190,446,447],{"class":354},"  variables",[190,449,358],{"class":196},[190,451,453,456,458],{"class":192,"line":452},10,[190,454,455],{"class":354},"    RENOVATE_EXTRA_FLAGS",[190,457,213],{"class":196},[190,459,460],{"class":219}," --autodiscover=false\n",[190,462,464,467,469,471,474],{"class":192,"line":463},11,[190,465,466],{"class":354},"    RENOVATE_ALLOW_POST_UPGRADE_COMMANDS",[190,468,213],{"class":196},[190,470,216],{"class":196},[190,472,473],{"class":219},"true",[190,475,476],{"class":196},"\"\n",[190,478,480,483,485],{"class":192,"line":479},12,[190,481,482],{"class":354},"    RENOVATE_REPOSITORIES",[190,484,213],{"class":196},[190,486,487],{"class":219}," $CI_PROJECT_PATH\n",[190,489,491,494],{"class":192,"line":490},13,[190,492,493],{"class":354},"  rules",[190,495,358],{"class":196},[190,497,499,502,505,507,509,512],{"class":192,"line":498},14,[190,500,501],{"class":196},"    -",[190,503,504],{"class":354}," 
if",[190,506,213],{"class":196},[190,508,371],{"class":196},[190,510,511],{"class":219},"$CI_PIPELINE_SOURCE == \"schedule\"",[190,513,377],{"class":196},[50,515,516,517,520],{},"Simple, right? The last piece is how you want the pipeline to run. As you can see, I decided to go for a ",[82,518,519],{},"schedule","\nthat you can add in your project's CI\u002FCD menu. But obviously, you could set it up on push, or on a manual action in the UI.",[45,522,524],{"id":523},"wait-how-do-i-update-my-server-now","Wait, how do I update my server now?",[50,526,527],{},"Obviously, merging things on GitLab doesn't update the server by itself, at least not at the automation level of Watchtower, which left me nothing to\ndo. My goal is better control, but not at the cost of convenience... Here again, GitLab comes to the rescue.",[87,529,531],{"id":530},"makefile-single-source-of-truth","Makefile - Single Source of Truth",[50,533,534,535,538,539,542],{},"I decided to use a ",[82,536,537],{},"Makefile"," as the ",[82,540,541],{},"source of truth",", which is a common practice. It keeps the deployment configuration in\none place, making it easier to manage and maintain. 
Here's an example of what it might look like:",[181,544,548],{"className":545,"code":546,"language":547,"meta":186,"style":186},"language-makefile shiki shiki-themes material-theme-lighter material-theme material-theme-palenight","SERVICES = traefik authentik nextcloud jellyfin # whatever you have...\n\npull:\n    @echo \"--- Synchronizing with GitLab (Hard Reset) ---\"\n    git fetch origin main\n    git reset --hard origin\u002Fmain\n\nupdate-%:\n    @echo \"--- Updating service: $* ---\"\n    @cd $* && docker compose pull -q\n    @cd $* && docker compose up -d --remove-orphans --wait\n","makefile",[54,549,550,555,559,564,569,574,579,583,588,593,598],{"__ignoreMap":186},[190,551,552],{"class":192,"line":193},[190,553,554],{},"SERVICES = traefik authentik nextcloud jellyfin # whatever you have...\n",[190,556,557],{"class":192,"line":200},[190,558,424],{"emptyLinePlaceholder":423},[190,560,561],{"class":192,"line":228},[190,562,563],{},"pull:\n",[190,565,566],{"class":192,"line":253},[190,567,568],{},"    @echo \"--- Synchronizing with GitLab (Hard Reset) ---\"\n",[190,570,571],{"class":192,"line":276},[190,572,573],{},"    git fetch origin main\n",[190,575,576],{"class":192,"line":299},[190,577,578],{},"    git reset --hard origin\u002Fmain\n",[190,580,581],{"class":192,"line":333},[190,582,424],{"emptyLinePlaceholder":423},[190,584,585],{"class":192,"line":433},[190,586,587],{},"update-%:\n",[190,589,590],{"class":192,"line":444},[190,591,592],{},"    @echo \"--- Updating service: $* ---\"\n",[190,594,595],{"class":192,"line":452},[190,596,597],{},"    @cd $* && docker compose pull -q\n",[190,599,600],{"class":192,"line":463},[190,601,602],{},"    @cd $* && docker compose up -d --remove-orphans --wait\n",[50,604,605,608,609,612],{},[82,606,607],{},"Note",": Using ",[54,610,611],{},"git reset --hard"," ensures the local state never drifts from the repo.",[50,614,615,616,619,620,623,624,626,627,630,631,634,635,638],{},"With this, updating a service is just a matter 
of running ",[54,617,618],{},"make pull"," and ",[54,621,622],{},"sudo make update-\u003Cservice_name>",".\nEasy, quick and efficient. After any update on a ",[54,625,56],{},", I have to SSH into the server and\nrun the make commands. It is rather secure because I control access to the server and the SSH port\nis not exposed, so I can log in only ",[82,628,629],{},"locally"," via my ",[82,632,633],{},"ssh key",". My Docker install requires ",[82,636,637],{},"sudo"," as well, so at this point,\neverything seems fine.",[50,640,641,642,152],{},"But obviously, it's not automated! And as I learned while trying to automate this process,\nthere are some challenges regarding ",[82,643,644],{},"security",[87,646,648],{"id":647},"the-trade-off-security-vs-convenience","The Trade-off: Security vs. Convenience",[50,650,651,652,657,658,661],{},"To further automate the updating process, I planned on using a ",[63,653,656],{"href":654,"rel":655},"https:\u002F\u002Fdocs.gitlab.com\u002Frunner\u002F",[67],"Gitlab runner","\ninstalled on the server that executes the ",[54,659,660],{},"make"," commands locally.\nI'll explain the runner configuration later, but let's first discuss the security implications of this approach.",[50,663,664,665,667],{},"Executing any docker command on my server requires ",[77,666,637],{},". This is a perfectly normal and secure configuration,\nbut it makes the automation process harder.",[50,669,670],{},"There are essentially three options to mitigate this:",[672,673,674,688,694],"ol",{},[105,675,676,679,680,683,684,687],{},[82,677,678],{},"Run Docker commands as a non-root user",": adding the ",[54,681,682],{},"gitlab-runner"," ",[77,685,686],{},"user"," to the docker group and running Docker commands as that user.",[105,689,690,693],{},[82,691,692],{},"Run Docker in rootless mode",": running Docker without root privileges. 
This is a more secure option but requires\nsome additional configuration steps.",[105,695,696,699],{},[82,697,698],{},"Run Docker with root privileges",": running Docker normally, but allowing the gitlab.com runner to connect via SSH and handling secrets there.",[50,701,702,703,152],{},"This is always the same dilemma: ",[82,704,705],{},"Security vs. Convenience",[50,707,708,709,714],{},"Giving a user access to the Docker socket is generally considered a security risk, as it allows executing\narbitrary commands with elevated privileges. For more details on the security risks, see the\n",[63,710,713],{"href":711,"rel":712},"https:\u002F\u002Fdocs.docker.com\u002Fengine\u002Fsecurity\u002F#docker-daemon-attack-surface",[67],"Docker documentation",".\nIf the GitLab instance or the runner is compromised, the attacker has a path to the host in this case.",[50,716,717,720,721,724,725,728],{},[82,718,719],{},"Why I chose it anyway:"," For a homelab, I prioritized ",[82,722,723],{},"minimizing the network attack surface",".\nBy using a local shell runner, the server doesn't need to expose SSH to the wider network or manage external credentials.\nThe runner stays ",[77,726,727],{},"inside the house",", pulling instructions down from GitLab rather than having GitLab push them into my server.\nIt’s a calculated risk: I traded \"Internal Privilege Escalation Risk\" for \"External Network Exposure Risk.\"",[87,730,732],{"id":731},"gitlab-runners","GitLab Runners",[50,734,735],{},"To bridge the gap between a \"Merge\" button on GitLab.com and your local terminal, you need a GitLab Runner acting\nas a local agent. 
Since we decided on a shell-based approach to trigger our Makefile, here is how to get it running:",[50,737,738,739,744],{},"First, install the runner on your server following the official ",[63,740,743],{"href":741,"rel":742},"https:\u002F\u002Fdocs.gitlab.com\u002Frunner\u002Finstall\u002F",[67],"GitLab documentation",".\nOnce installed, you need to link it to your project:",[181,746,750],{"className":747,"code":748,"language":749,"meta":186,"style":186},"language-Bash shiki shiki-themes material-theme-lighter material-theme material-theme-palenight","sudo gitlab-runner register\n","Bash",[54,751,752],{"__ignoreMap":186},[190,753,754],{"class":192,"line":193},[190,755,748],{},[50,757,758],{},"During the prompt:",[102,760,761,769,772],{},[105,762,763,764,768],{},"GitLab Instance URL: this is ",[63,765,766],{"href":766,"rel":767},"https:\u002F\u002Fgitlab.com\u002F",[67]," (unless you are self-hosting GitLab).",[105,770,771],{},"Registration Token: Grab this from your project under Settings > CI\u002FCD > Runners.",[105,773,774],{},"Executor: I chose shell. This allows the runner to execute commands directly on the host.",[50,776,777],{},"By default, the runner creates a system user named gitlab-runner. For our Makefile to work without manual intervention,\nthis user needs specific permissions. To allow the runner to manage containers without sudo, we need to add it to the docker group:",[181,779,781],{"className":747,"code":780,"language":749,"meta":186,"style":186},"sudo usermod -aG docker gitlab-runner\n",[54,782,783],{"__ignoreMap":186},[190,784,785],{"class":192,"line":193},[190,786,780],{},[50,788,789,790,793],{},"This is the ",[77,791,792],{},"unsafe"," decision we discussed earlier.",[50,795,796],{},"The runner also needs a workspace. While GitLab CI usually clones the repo into a temporary build folder, I also wanted\nto be able to manually deploy, just in case. 
I manage the local git repository in my home folder, so I needed to grant\nthe gitlab-runner user access to those directories. This allows the runner to clone the repo and run the make commands.",[87,798,800],{"id":799},"the-workflow-in-action","The Workflow in Action",[50,802,803],{},"Now, the \"New Version\" flow looks like this:",[50,805,806],{},[156,807],{"alt":808,"src":809,"title":810},"GitOps Workflow","\u002Fposts\u002Fhomelab_automation\u002Fgitops_workflow.png","My *GitOps* workflow.",[672,812,813,819,826,831,842],{},[105,814,815,818],{},[82,816,817],{},"Renovate"," detects an update and opens a GitLab MR.",[105,820,821,822,825],{},"I review the ",[82,823,824],{},"Changelog"," directly in the MR description.",[105,827,828,829,152],{},"I click ",[82,830,151],{},[105,832,833,834,837,838,841],{},"The ",[82,835,836],{},"Local Shell Runner"," triggers a job that runs ",[54,839,840],{},"docker compose pull && docker compose up -d"," via make commands.",[105,843,844],{},"Everything is updated in seconds, with a full audit trail in Git.",[45,846,848],{"id":847},"conclusion","Conclusion",[50,850,851,852,855,856,858],{},"Moving to this setup felt a bit like ",[77,853,854],{},"growing up",". It’s not just about the automation; it’s about the peace of mind that comes with\na controlled process. The ability to roll back changes quickly and easily is invaluable.\nIf you are tired of your homelab breaking behind your back, stop using ",[54,857,60],{}," and start building a pipeline.\nYour future self (who just wants things to work) will thank you.",[50,860,861],{},"I'll definitely have a look at the Docker executor for GitLab runners. It could be the missing piece that tightens up the\nsecurity of my setup. 
Stay tuned !",[863,864,865],"style",{},"html .light .shiki span {color: var(--shiki-light);background: var(--shiki-light-bg);font-style: var(--shiki-light-font-style);font-weight: var(--shiki-light-font-weight);text-decoration: var(--shiki-light-text-decoration);}html.light .shiki span {color: var(--shiki-light);background: var(--shiki-light-bg);font-style: var(--shiki-light-font-style);font-weight: var(--shiki-light-font-weight);text-decoration: var(--shiki-light-text-decoration);}html .default .shiki span {color: var(--shiki-default);background: var(--shiki-default-bg);font-style: var(--shiki-default-font-style);font-weight: var(--shiki-default-font-weight);text-decoration: var(--shiki-default-text-decoration);}html .shiki span {color: var(--shiki-default);background: var(--shiki-default-bg);font-style: var(--shiki-default-font-style);font-weight: var(--shiki-default-font-weight);text-decoration: var(--shiki-default-text-decoration);}html .dark .shiki span {color: var(--shiki-dark);background: var(--shiki-dark-bg);font-style: var(--shiki-dark-font-style);font-weight: var(--shiki-dark-font-weight);text-decoration: var(--shiki-dark-text-decoration);}html.dark .shiki span {color: var(--shiki-dark);background: var(--shiki-dark-bg);font-style: var(--shiki-dark-font-style);font-weight: var(--shiki-dark-font-weight);text-decoration: var(--shiki-dark-text-decoration);}html pre.shiki code .sMK4o, html code.shiki .sMK4o{--shiki-light:#39ADB5;--shiki-default:#89DDFF;--shiki-dark:#89DDFF}html pre.shiki code .spNyl, html code.shiki .spNyl{--shiki-light:#9C3EDA;--shiki-default:#C792EA;--shiki-dark:#C792EA}html pre.shiki code .sfazB, html code.shiki .sfazB{--shiki-light:#91B859;--shiki-default:#C3E88D;--shiki-dark:#C3E88D}html pre.shiki code .swJcz, html code.shiki 
.swJcz{--shiki-light:#E53935;--shiki-default:#F07178;--shiki-dark:#F07178}",{"title":186,"searchDepth":200,"depth":200,"links":867},[868,873,879],{"id":47,"depth":200,"text":48,"children":869},[870,871,872],{"id":89,"depth":228,"text":90},{"id":129,"depth":228,"text":130},{"id":163,"depth":228,"text":164},{"id":523,"depth":200,"text":524,"children":874},[875,876,877,878],{"id":530,"depth":228,"text":531},{"id":647,"depth":228,"text":648},{"id":731,"depth":228,"text":732},{"id":799,"depth":228,"text":800},{"id":847,"depth":200,"text":848},"2026-04-04","It seems everyone building a homelab goes through this phase. This is my take on homelab automation, extensively using gitlab CI\u002FCD powers and Renovate bot.","md","\u002Fposts\u002Fhomelab_automation\u002Ffeatured.svg",{},null,{"title":18,"description":881},[888,889,890,891,892,893,269,817],"gitlab","automation","homelab","docker","CI\u002FCD","gitOps","5-Y4mmUlac5gu-rMPS_zZA7QspZ6sNvuGxCyiJo3kOM",{"id":896,"title":10,"body":897,"date":2050,"description":2051,"extension":882,"image":2052,"meta":2053,"navigation":423,"path":11,"readingTime":885,"seo":2054,"stem":12,"tags":2055,"__hash__":2063},"content\u002Fposts\u002Fagents_mcp_rag_local_foss.md",{"type":42,"value":898,"toc":2028},[899,903,906,910,917,924,927,931,934,945,952,960,964,967,981,984,1004,1008,1011,1025,1028,1031,1035,1038,1045,1048,1052,1061,1065,1069,1078,1081,1085,1088,1092,1096,1099,1104,1111,1114,1117,1120,1123,1140,1143,1151,1154,1222,1225,1229,1232,1235,1238,1384,1387,1390,1405,1409,1432,1443,1466,1473,1768,1771,1775,1778,1784,1824,1831,1834,1872,1879,1969,1976,1979,1985,1988,1992,2001,2004,2011,2013,2022,2025],[45,900,902],{"id":901},"some-definitions","Some Definitions",[50,904,905],{},"Let's start with some context and definitions about what we are going to use.",[87,907,909],{"id":908},"agentic-ai","Agentic AI",[50,911,912,913,916],{},"Agentic AI refers to intelligent ",[82,914,915],{},"Agents"," that can perceive their environment, plan 
actions, and execute them autonomously to achieve high‑level goals.\nUnlike a simple prompt‑response model, an agent can plan sub-tasks ahead and call the appropriate tools to accomplish them.\nDepending on its setup, it can browse the web, analyze data, call APIs, run code, or query databases. You can also include a Human-in-the-Loop to supervise\nthe agent's actions.",[50,918,919],{},[156,920],{"alt":921,"src":922,"title":923},"Basic Agent Workflow","\u002Fposts\u002Fagents_mcp_rag_local_foss\u002Fagent_flow.png","A basic agent workflow.",[50,925,926],{},"Roughly, the LLM acts as the brain of the Agent, planning and taking decisions, while the tools are its means to interact with its environment.",[87,928,930],{"id":929},"local-llm-inference","Local LLM inference",[50,932,933],{},"Local LLM inference means running a large language model directly on your own hardware (CPU, GPU) instead of sending data to the cloud.\nThe main benefits are:",[102,935,936,939,942],{},[105,937,938],{},"No network round‑trips for every query.",[105,940,941],{},"Data privacy – no sensitive text leaves your premises.",[105,943,944],{},"Cost efficiency – you don't have to pay for each query.",[50,946,947],{},[156,948],{"alt":949,"src":950,"title":951},"Ollama Logo and Name","\u002Fposts\u002Fagents_mcp_rag_local_foss\u002Follama_name.svg","Ollama is a tool to run LLMs locally.",[50,953,954,955,959],{},"I discussed how to set up Ollama in a previous ",[63,956,958],{"href":957},".\u002Fself_host_your_ai_assistant","post",". Obviously, the performance is limited by\nyour hardware, but for experimenting it is a good choice. Other options would be to use free tiers of inference providers, or paid ones.\nHere, HuggingFace shines again, as it provides a simple way to call inference endpoints with the various models it hosts.\nYou can also connect to your favorite mainstream service like OpenAI or Anthropic. 
All of it is explained in the tutorials from the HuggingFace courses.",[87,961,963],{"id":962},"model-context-protocol-mcp","Model Context Protocol (MCP)",[50,965,966],{},"The Model Context Protocol (MCP) is a set of guidelines that standardizes how external data (e.g., APIs, databases, files) is exposed\nto an LLM as context. Think of it as a contract that defines:",[102,968,969,972,975,978],{},[105,970,971],{},"Context schema – JSON schema for the data structure.",[105,973,974],{},"Metadata – Provenance, freshness, and privacy tags.",[105,976,977],{},"Access patterns – How to retrieve, cache, and stream the data.",[105,979,980],{},"Security controls – Token scopes and rate limits.",[50,982,983],{},"MCP enables plug‑and‑play context for agents: the same agent can query a weather API, a CSV file, or a Neo4j graph, all through a uniform interface.\nIt acts as an accelerator for the development of agents, as any set of tools can be implemented independently of the AI models used to power them.",[50,985,986,987,990,991,994,995,998,999],{},"To explain how it works in simple terms, let's say that the tools are exposed by an MCP ",[77,988,989],{},"server",", your application or agent is called a ",[77,992,993],{},"host",", and this host\nimplements an MCP ",[77,996,997],{},"client"," as one of its functionalities. You can find a more in-depth explanation in this\n",[63,1000,1003],{"href":1001,"rel":1002},"https:\u002F\u002Fhuggingface.co\u002Flearn\u002Fmcp-course\u002Funit1\u002Fkey-concepts",[67],"course",[87,1005,1007],{"id":1006},"retrieval-augmented-generation-rag","Retrieval Augmented Generation (RAG)",[50,1009,1010],{},"Retrieval Augmented Generation is a technique that augments the LLM’s answer with external knowledge fetched in real time. 
The typical pipeline is:",[102,1012,1013,1016,1019,1022],{},[105,1014,1015],{},"Query formulation – The agent generates a search query or a prompt.",[105,1017,1018],{},"Retrieval – The system looks up the most relevant passages from a vector store, knowledge graph, or web search.",[105,1020,1021],{},"Fusion – The retrieved snippets are concatenated with the prompt.",[105,1023,1024],{},"Generation – The LLM produces the final answer.",[50,1026,1027],{},"RAG is especially useful when dealing with highly specialized domains or rapidly changing data that the trained LLM would not have seen.",[50,1029,1030],{},"A quick word about Agentic RAG here. In the traditional RAG implementation, the retrieval tool is called first, and its answer is then passed\nas additional context to the LLM along with the initial user query. In agentic RAG, the workflow is more flexible because the agent decides\nwhether to use the retriever tool. Another big advantage is the query reformulation that the agent performs. In traditional RAG,\nthe initial user query, often a complete question, is generally passed as-is to the retriever, which is not optimal. In agentic RAG,\nthe agent LLM reformulates the query before passing it to the retriever, improving the results.",[87,1032,1034],{"id":1033},"vector-stores","Vector Stores",[50,1036,1037],{},"Vector Stores are a foundation of RAG implementations. 
Their role is to turn raw knowledge into searchable embeddings,\nenabling RAG systems to use specific information without having to load entire documents into memory.",[50,1039,1040],{},[156,1041],{"alt":1042,"src":1043,"title":1044},"Basic RAG Workflow with a vector Store","\u002Fposts\u002Fagents_mcp_rag_local_foss\u002FRAG.png","A basic RAG workflow using embeddings and vector stores.",[50,1046,1047],{},"In general, the input documents are split into smaller chunks, turning a monolithic document into a collection of searchable,\nsemantically coherent units that fit within the LLM’s context window and that can be ranked accurately.",[87,1049,1051],{"id":1050},"knowledge-graphs","Knowledge Graphs",[50,1053,1054,1055,1060],{},"A knowledge graph is a graph‑structured representation of data with nodes and edges representing entities and their relationships.\nStored in a graph database like ",[63,1056,1059],{"href":1057,"rel":1058},"https:\u002F\u002Fneo4j.com\u002F",[67],"Neo4J",", a knowledge graph supports efficient traversal, semantic search, and query languages,\nenabling agents to answer complex questions, discover hidden connections, and supply structured evidence.",[45,1062,1064],{"id":1063},"where-to-start-where-to-learn","Where to start, where to learn?",[87,1066,1068],{"id":1067},"huggingface-learn","HuggingFace Learn",[50,1070,1071,1072,1077],{},"When looking for good tutorials to begin with, I came upon the HuggingFace hub ",[63,1073,1076],{"href":1074,"rel":1075},"https:\u002F\u002Fhuggingface.co\u002Flearn",[67],"Learn"," section.\nThere you can find complete lessons and tutorials on LLM usage and training,\nagentic frameworks like smolagents and Langchain, and even an MCP course built just a few months ago as of this writing.\nI found these courses complete and well-written, with hands-on exercises that you can run either using their inference\noptions (beware the costs) or locally with your Ollama instance.",[50,1079,1080],{},"That being said, you 
can use whatever material you prefer. I often find YouTube videos to be of great use because they showcase\nsimple use cases that you can duplicate.",[87,1082,1084],{"id":1083},"diy-is-always-a-good-choice","DIY is always a good choice",[50,1086,1087],{},"Nothing compares to practice when trying to learn and understand how things work. You could spend hours reading course materials\nand watching videos online, but implementing them yourself, even as simple versions designed for toy use cases, will give you a much better understanding.\nI decided to try building my own agent using Langchain and Ollama.",[45,1089,1091],{"id":1090},"my-experimental-agent-step-by-step","My Experimental Agent step by step",[87,1093,1095],{"id":1094},"agentic-framework","Agentic Framework",[50,1097,1098],{},"Fortunately, we don't have to reinvent the wheel. There are multiple open source AI Agent frameworks available, all with their pros and cons.\nI tested out two of them, smolagents from Huggingface and Langchain\u002FLanggraph.",[1100,1101],"repo",{":show-thumbnail":473,"platform":1102,"repo":1103},"github","huggingface\u002Fsmolagents",[50,1105,1106,1107,1110],{},"The first one focuses mainly on ",[82,1108,1109],{},"Code Agents",", which are designed to be able to generate and execute code as they advance\nthrough their answering steps. It makes them really powerful but more unpredictable.",[1100,1112],{":show-thumbnail":473,"platform":1102,"repo":1113},"langchain-ai\u002Flanggraph",[50,1115,1116],{},"On the other side of the spectrum, there is Langgraph. Its graph-based approach allows for more control over the agent workflow,\nbut it requires more setup and understanding. 
It is also an established framework among professionals, the control features\nbeing most relevant in business and enterprise use cases.",[50,1118,1119],{},"I decided to go with Langgraph as I thought I would get more value out of my training, but ultimately I used both as you will see later.\nAnyway, if you're like me trying to experiment, I recommend trying both !",[50,1121,1122],{},"In langgraph, you have to define a state, nodes and edges to build the graph of your agent. The state is the content of your application that is\npassed through the graph nodes via the edges. For my use case, the state is the complete history of messages built by the different nodes. It consists\nof a series of Human, AI and Tool messages.",[181,1124,1128],{"className":1125,"code":1126,"language":1127,"meta":186,"style":186},"class AgentState(TypedDict):\n    messages: Annotated[list[AnyMessage], add_messages]\n","python",[54,1129,1130,1135],{"__ignoreMap":186},[190,1131,1132],{"class":192,"line":193},[190,1133,1134],{},"class AgentState(TypedDict):\n",[190,1136,1137],{"class":192,"line":200},[190,1138,1139],{},"    messages: Annotated[list[AnyMessage], add_messages]\n",[50,1141,1142],{},"For the first implementation, I'm using two nodes:",[102,1144,1145,1148],{},[105,1146,1147],{},"Assistant node - The LLM brain of the agent, it is defined as the start of the flow.",[105,1149,1150],{},"Tool node - The toolbox containing all the tools and their metadata.",[50,1152,1153],{},"The only specific edge is the link between assistant and tool nodes. 
It is a conditional relation activated by the LLM.",[181,1155,1157],{"className":1125,"code":1156,"language":1127,"meta":186,"style":186},"builder = StateGraph(AgentState)\nmemory = InMemorySaver()\n\nbuilder.add_node(\"assistant\", assistant)\nbuilder.add_node(\"tools\", ToolNode(tools))\n\nbuilder.add_edge(START, \"assistant\")\nbuilder.add_conditional_edges(\n    \"assistant\",\n    tools_condition,\n)\nbuilder.add_edge(\"tools\", \"assistant\")\nagent = builder.compile(checkpointer=memory)\n",[54,1158,1159,1164,1169,1173,1178,1183,1187,1192,1197,1202,1207,1212,1217],{"__ignoreMap":186},[190,1160,1161],{"class":192,"line":193},[190,1162,1163],{},"builder = StateGraph(AgentState)\n",[190,1165,1166],{"class":192,"line":200},[190,1167,1168],{},"memory = InMemorySaver()\n",[190,1170,1171],{"class":192,"line":228},[190,1172,424],{"emptyLinePlaceholder":423},[190,1174,1175],{"class":192,"line":253},[190,1176,1177],{},"builder.add_node(\"assistant\", assistant)\n",[190,1179,1180],{"class":192,"line":276},[190,1181,1182],{},"builder.add_node(\"tools\", ToolNode(tools))\n",[190,1184,1185],{"class":192,"line":299},[190,1186,424],{"emptyLinePlaceholder":423},[190,1188,1189],{"class":192,"line":333},[190,1190,1191],{},"builder.add_edge(START, \"assistant\")\n",[190,1193,1194],{"class":192,"line":433},[190,1195,1196],{},"builder.add_conditional_edges(\n",[190,1198,1199],{"class":192,"line":444},[190,1200,1201],{},"    \"assistant\",\n",[190,1203,1204],{"class":192,"line":452},[190,1205,1206],{},"    tools_condition,\n",[190,1208,1209],{"class":192,"line":463},[190,1210,1211],{},")\n",[190,1213,1214],{"class":192,"line":479},[190,1215,1216],{},"builder.add_edge(\"tools\", \"assistant\")\n",[190,1218,1219],{"class":192,"line":490},[190,1220,1221],{},"agent = builder.compile(checkpointer=memory)\n",[50,1223,1224],{},"In this sample code, I also added a memory checkpointer to allow the LLM to remember interactions, like you would expect in\na chat application. 
It also allows follow-up queries with additional information.",[87,1226,1228],{"id":1227},"mcp-tools-integration","MCP Tools integration",[50,1230,1231],{},"After its release in late 2024, MCP has been at the center of the AI world. I figured that while I was experimenting with LLM tools\nand agents, I'd take a look at MCP servers and clients along the way. Obviously, for the little experiment I'm building, I didn't need\nany of this more complex setup.",[1100,1233],{":show-thumbnail":473,"platform":1102,"repo":1234},"modelcontextprotocol\u002Fpython-sdk",[50,1236,1237],{},"Tools are the means of interaction for the agent. For a tool to be efficient, it needs to have precise documentation and typed inputs.\nIn the Python version of the MCP server I used, all this is done through typing and docstrings. Apart from that, the MCP implementation\nis quite simple. Look at this math MCP server:",[181,1239,1241],{"className":1125,"code":1240,"language":1127,"meta":186,"style":186},"from mcp.server.fastmcp import FastMCP\n\nmcp = 
FastMCP(\"Math\")\n",[190,1258,1259],{"class":192,"line":253},[190,1260,424],{"emptyLinePlaceholder":423},[190,1262,1263],{"class":192,"line":276},[190,1264,424],{"emptyLinePlaceholder":423},[190,1266,1267],{"class":192,"line":299},[190,1268,1269],{},"@mcp.tool()\n",[190,1271,1272],{"class":192,"line":333},[190,1273,1274],{},"def multiply(a: float, b: float) -> float:\n",[190,1276,1277],{"class":192,"line":433},[190,1278,1279],{},"    \"\"\"\n",[190,1281,1282],{"class":192,"line":444},[190,1283,1284],{},"    Multiplies two numbers.\n",[190,1286,1287],{"class":192,"line":452},[190,1288,1289],{},"    Args:\n",[190,1291,1292],{"class":192,"line":463},[190,1293,1294],{},"        a (float): the first number\n",[190,1296,1297],{"class":192,"line":479},[190,1298,1299],{},"        b (float): the second number\n",[190,1301,1302],{"class":192,"line":490},[190,1303,1279],{},[190,1305,1306],{"class":192,"line":498},[190,1307,1308],{},"    return a * b\n",[190,1310,1312],{"class":192,"line":1311},15,[190,1313,424],{"emptyLinePlaceholder":423},[190,1315,1317],{"class":192,"line":1316},16,[190,1318,424],{"emptyLinePlaceholder":423},[190,1320,1322],{"class":192,"line":1321},17,[190,1323,1269],{},[190,1325,1327],{"class":192,"line":1326},18,[190,1328,1329],{},"def add(a: float, b: float) -> float:\n",[190,1331,1333],{"class":192,"line":1332},19,[190,1334,1279],{},[190,1336,1338],{"class":192,"line":1337},20,[190,1339,1340],{},"    Adds two numbers.\n",[190,1342,1344],{"class":192,"line":1343},21,[190,1345,1289],{},[190,1347,1349],{"class":192,"line":1348},22,[190,1350,1294],{},[190,1352,1354],{"class":192,"line":1353},23,[190,1355,1299],{},[190,1357,1359],{"class":192,"line":1358},24,[190,1360,1279],{},[190,1362,1364],{"class":192,"line":1363},25,[190,1365,1366],{},"    return a + b\n",[190,1368,1370],{"class":192,"line":1369},26,[190,1371,424],{"emptyLinePlaceholder":423},[190,1373,1375],{"class":192,"line":1374},27,[190,1376,1377],{},"if __name__ == 
\"__main__\":\n",[190,1379,1381],{"class":192,"line":1380},28,[190,1382,1383],{},"    mcp.run(transport=\"stdio\")\n",[50,1385,1386],{},"You can define as many MCP servers as you need: I created a math one, one for fetching the weather, one for searching the web, ...\nOne of the advantages of this MCP setup is the reusability of these servers. You implement a tool once, and then you can use it in\nas many agents or applications as you want.",[50,1388,1389],{},"You can also use publicly available MCP servers from the web, like the ones from GitHub or Google, to be able to interact with their services.\nIt is a formidable way to quickly build well-integrated toolboxes, but the downside is the potential transit of data to remote servers.",[50,1391,1392,1393,1396,1397,1400,1401,1404],{},"To ",[77,1394,1395],{},"connect"," your agent LLM node to the different MCP servers, I used the provided ",[54,1398,1399],{},"MultiServerMCPClient"," from ",[54,1402,1403],{},"langchain_mcp_adapters"," and the bind-tools\nmethods.",[87,1406,1408],{"id":1407},"adding-data-analysis-capabilities-with-code-agent","Adding Data Analysis Capabilities with Code Agent",[50,1410,1411,1412,1415,1416,1419,1420,1425,1426,1431],{},"Until now, everything we did was based on simple tools with no ",[77,1413,1414],{},"real"," added value. Adding the ability to browse the web with ",[54,1417,1418],{},"DuckDuckGoSearchRun"," from langchain\nis certainly useful to overcome the lack of up-to-date data of the LLMs, but you can have this now in any GUI interface like\n",[63,1421,1424],{"href":1422,"rel":1423},"https:\u002F\u002Fgithub.com\u002Fn4ze3m\u002Fpage-assist",[67],"PageAssist"," or ",[63,1427,1430],{"href":1428,"rel":1429},"https:\u002F\u002Fopenwebui.com\u002F",[67],"OpenWebUI",". What can I add to my agent that could be really useful to me ?",[50,1433,1434,1435,1438,1439,1442],{},"I'm a Data Scientist, and as such, most of my time is spent ",[77,1436,1437],{},"analyzing"," data. 
It means exploring data, computing statistics and metrics, and visualizing them.\nIt is a repetitive task but not so simple to automate because for each use case, the exploration will be different. What if I could ask an agent to ",[77,1440,1441],{},"explore"," a dataset ?\nIt would mean loading, let's say, a CSV file, computing some statistics, drawing charts and reflecting on the outputs.",[50,1444,1445,1446,1448,1449,1454,1455,1460,1461,152],{},"Now, I said earlier that the ",[77,1447,1076],{}," section from HuggingFace was an excellent starting point for the journey. It's even better, it's a gold mine !\nThe ",[63,1450,1453],{"href":1451,"rel":1452},"https:\u002F\u002Fhuggingface.co\u002Flearn\u002Fcookbook\u002Findex",[67],"Open-Source AI Cookbook"," contains a lot of recipes that can be adapted to your needs.\nIt's also a great source of inspiration for the possible applications of LLMs and Agents. I found implementations of an\n",[63,1456,1459],{"href":1457,"rel":1458},"https:\u002F\u002Fhuggingface.co\u002Flearn\u002Fcookbook\u002Fagent_data_analyst",[67],"Analytics Assistant"," and a ",[63,1462,1465],{"href":1463,"rel":1464},"https:\u002F\u002Fhuggingface.co\u002Flearn\u002Fcookbook\u002Frag_with_knowledge_graphs_neo4j",[67],"Knowledge Graph RAG",[50,1467,1468,1469,1472],{},"I decided to follow the implementation of the Data Analysis Agent using a smolagent ",[82,1470,1471],{},"CodeAgent"," and expose its analysis capabilities as a tool to my manager agent.\nI am not sure that this architecture is a recommended one, but it will do for the time being. Code Agents, as I mentioned, have the ability to\nexecute code snippets while progressing through their tasks. 
I think of it as an agent that can build its own tools, at least simple ones.\nLet's have a look at my version:",[181,1474,1476],{"className":1125,"code":1475,"language":1127,"meta":186,"style":186},"from mcp.server.fastmcp import FastMCP\nfrom smolagents import CodeAgent, LiteLLMModel\n\nmcp = FastMCP(\"Data Analysis\")\n\nmodel = LiteLLMModel(\n    model_id=\"ollama_chat\u002Fyour_model_name\",\n    api_base=OLLAMA_HOST,\n    num_ctx=8192,\n)\n\nagent = CodeAgent(\n    tools=[],\n    model=model,\n    additional_authorized_imports=[\n        \"numpy\", \"pandas\", \"matplotlib.pyplot\", \"seaborn\"],\n)\n\n@mcp.tool()\ndef run_analysis(additional_notes: str, source_file: str) -> str:\n    \"\"\"Analyses the content of a given csv file.\n    Args:\n        additional_notes (str): notes to guide the analysis\n        source_file (str): path to local source file\n\n    \"\"\"\n    prompt = f\"\"\"You are an expert data analyst.\n        Please load the source file and analyze its content.\n        \n        The first analysis to perform is a generic content exploration, \n        with simple statistics, null values, outliers, and types \n        of each columns.\n        \n        Secondly, according to the variables you have, list 3 \n        interesting questions that could be asked on this data, \n        for instance about specific correlations.\n        Then answer these questions one by one, by finding the \n        relevant numbers. Meanwhile, plot some figures using \n        matplotlib\u002Fseaborn and save them to the (already existing) \n        folder '.\u002Ffigures\u002F': take care to clear each figure \n        with plt.clf() before doing another plot.\n        \n        In your final answer: summarize the initial analysis and \n        these correlations and trends. After each number derive \n        real worlds insights. 
Your final answer should have at \n        least 3 numbered and detailed parts.\n        \n        - Here are additional notes and query to guide \n          your analysis: {additional_notes}.\n        - Here is the file path: {source_file}.\n        \"\"\"\n\n    return agent.run(prompt)\n\nif __name__ == \"__main__\":\n    mcp.run(transport=\"stdio\")\n",[54,1477,1478,1482,1487,1491,1496,1500,1505,1510,1515,1520,1524,1528,1533,1538,1543,1548,1553,1557,1561,1565,1570,1575,1579,1584,1589,1593,1597,1602,1607,1613,1619,1625,1631,1636,1642,1648,1654,1660,1666,1672,1678,1684,1689,1695,1701,1707,1713,1718,1724,1730,1736,1742,1747,1753,1758,1763],{"__ignoreMap":186},[190,1479,1480],{"class":192,"line":193},[190,1481,1247],{},[190,1483,1484],{"class":192,"line":200},[190,1485,1486],{},"from smolagents import CodeAgent, LiteLLMModel\n",[190,1488,1489],{"class":192,"line":228},[190,1490,424],{"emptyLinePlaceholder":423},[190,1492,1493],{"class":192,"line":253},[190,1494,1495],{},"mcp = FastMCP(\"Data Analysis\")\n",[190,1497,1498],{"class":192,"line":276},[190,1499,424],{"emptyLinePlaceholder":423},[190,1501,1502],{"class":192,"line":299},[190,1503,1504],{},"model = LiteLLMModel(\n",[190,1506,1507],{"class":192,"line":333},[190,1508,1509],{},"    model_id=\"ollama_chat\u002Fyour_model_name\",\n",[190,1511,1512],{"class":192,"line":433},[190,1513,1514],{},"    api_base=OLLAMA_HOST,\n",[190,1516,1517],{"class":192,"line":444},[190,1518,1519],{},"    num_ctx=8192,\n",[190,1521,1522],{"class":192,"line":452},[190,1523,1211],{},[190,1525,1526],{"class":192,"line":463},[190,1527,424],{"emptyLinePlaceholder":423},[190,1529,1530],{"class":192,"line":479},[190,1531,1532],{},"agent = CodeAgent(\n",[190,1534,1535],{"class":192,"line":490},[190,1536,1537],{},"    tools=[],\n",[190,1539,1540],{"class":192,"line":498},[190,1541,1542],{},"    model=model,\n",[190,1544,1545],{"class":192,"line":1311},[190,1546,1547],{},"    
additional_authorized_imports=[\n",[190,1549,1550],{"class":192,"line":1316},[190,1551,1552],{},"        \"numpy\", \"pandas\", \"matplotlib.pyplot\", \"seaborn\"],\n",[190,1554,1555],{"class":192,"line":1321},[190,1556,1211],{},[190,1558,1559],{"class":192,"line":1326},[190,1560,424],{"emptyLinePlaceholder":423},[190,1562,1563],{"class":192,"line":1332},[190,1564,1269],{},[190,1566,1567],{"class":192,"line":1337},[190,1568,1569],{},"def run_analysis(additional_notes: str, source_file: str) -> str:\n",[190,1571,1572],{"class":192,"line":1343},[190,1573,1574],{},"    \"\"\"Analyses the content of a given csv file.\n",[190,1576,1577],{"class":192,"line":1348},[190,1578,1289],{},[190,1580,1581],{"class":192,"line":1353},[190,1582,1583],{},"        additional_notes (str): notes to guide the analysis\n",[190,1585,1586],{"class":192,"line":1358},[190,1587,1588],{},"        source_file (str): path to local source file\n",[190,1590,1591],{"class":192,"line":1363},[190,1592,424],{"emptyLinePlaceholder":423},[190,1594,1595],{"class":192,"line":1369},[190,1596,1279],{},[190,1598,1599],{"class":192,"line":1374},[190,1600,1601],{},"    prompt = f\"\"\"You are an expert data analyst.\n",[190,1603,1604],{"class":192,"line":1380},[190,1605,1606],{},"        Please load the source file and analyze its content.\n",[190,1608,1610],{"class":192,"line":1609},29,[190,1611,1612],{},"        \n",[190,1614,1616],{"class":192,"line":1615},30,[190,1617,1618],{},"        The first analysis to perform is a generic content exploration, \n",[190,1620,1622],{"class":192,"line":1621},31,[190,1623,1624],{},"        with simple statistics, null values, outliers, and types \n",[190,1626,1628],{"class":192,"line":1627},32,[190,1629,1630],{},"        of each columns.\n",[190,1632,1634],{"class":192,"line":1633},33,[190,1635,1612],{},[190,1637,1639],{"class":192,"line":1638},34,[190,1640,1641],{},"        Secondly, according to the variables you have, list 3 
\n",[190,1643,1645],{"class":192,"line":1644},35,[190,1646,1647],{},"        interesting questions that could be asked on this data, \n",[190,1649,1651],{"class":192,"line":1650},36,[190,1652,1653],{},"        for instance about specific correlations.\n",[190,1655,1657],{"class":192,"line":1656},37,[190,1658,1659],{},"        Then answer these questions one by one, by finding the \n",[190,1661,1663],{"class":192,"line":1662},38,[190,1664,1665],{},"        relevant numbers. Meanwhile, plot some figures using \n",[190,1667,1669],{"class":192,"line":1668},39,[190,1670,1671],{},"        matplotlib\u002Fseaborn and save them to the (already existing) \n",[190,1673,1675],{"class":192,"line":1674},40,[190,1676,1677],{},"        folder '.\u002Ffigures\u002F': take care to clear each figure \n",[190,1679,1681],{"class":192,"line":1680},41,[190,1682,1683],{},"        with plt.clf() before doing another plot.\n",[190,1685,1687],{"class":192,"line":1686},42,[190,1688,1612],{},[190,1690,1692],{"class":192,"line":1691},43,[190,1693,1694],{},"        In your final answer: summarize the initial analysis and \n",[190,1696,1698],{"class":192,"line":1697},44,[190,1699,1700],{},"        these correlations and trends. After each number derive \n",[190,1702,1704],{"class":192,"line":1703},45,[190,1705,1706],{},"        real worlds insights. 
Your final answer should have at \n",[190,1708,1710],{"class":192,"line":1709},46,[190,1711,1712],{},"        least 3 numbered and detailed parts.\n",[190,1714,1716],{"class":192,"line":1715},47,[190,1717,1612],{},[190,1719,1721],{"class":192,"line":1720},48,[190,1722,1723],{},"        - Here are additional notes and query to guide \n",[190,1725,1727],{"class":192,"line":1726},49,[190,1728,1729],{},"          your analysis: {additional_notes}.\n",[190,1731,1733],{"class":192,"line":1732},50,[190,1734,1735],{},"        - Here is the file path: {source_file}.\n",[190,1737,1739],{"class":192,"line":1738},51,[190,1740,1741],{},"        \"\"\"\n",[190,1743,1745],{"class":192,"line":1744},52,[190,1746,424],{"emptyLinePlaceholder":423},[190,1748,1750],{"class":192,"line":1749},53,[190,1751,1752],{},"    return agent.run(prompt)\n",[190,1754,1756],{"class":192,"line":1755},54,[190,1757,424],{"emptyLinePlaceholder":423},[190,1759,1761],{"class":192,"line":1760},55,[190,1762,1377],{},[190,1764,1766],{"class":192,"line":1765},56,[190,1767,1383],{},[50,1769,1770],{},"As you can see, the tool is just a prompt where you ask the agent to analyze the given CSV file ! The first few tests I have made\nwith this version are quite impressive already ! It created visualizations and metrics, and reflected on them to extract insights in its final answer.\nThis first version could be improved with dedicated tools, like predefined plots and metrics, to increase the control over what the agent\nis going to achieve when we call it.",[87,1772,1774],{"id":1773},"using-knowledge-graphs-to-enhance-rag","Using Knowledge Graphs to enhance RAG",[50,1776,1777],{},"I gave a shot at Vector Stores and traditional RAG a few months back when first hearing about the technique. 
The idea is to improve\nthe quality of answers from an LLM using specific data, either more recent data or data dedicated to a certain domain, for example.\nThe typical example for me is a coding assistant capable of searching through the documentation of a specific language or package.\nAnother technique to achieve this would be finetuning, but in the case of LLMs the constraints are quite hard, making RAG a good alternative.",[50,1779,1780,1781,1783],{},"More recently, ",[82,1782,1051],{}," (KG) have been introduced as a way to improve the LLM answers in the case of semantic searches by\nadding contextual understanding of the data. It also gives a way to better explain the reasoning made by the LLM.",[50,1785,833,1786,1790,1791,1794,1795,1800,1801,1804,1805,619,1808,1811,1812,619,1815,1818,1819,152],{},[63,1787,1789],{"href":1463,"rel":1788},[67],"recipe"," from HuggingFace is using ",[63,1792,1059],{"href":1057,"rel":1793},[67]," as\nthe graph database. I am using the Docker version of Neo4J to host my sample database, but there is a free plan for hosting on ",[63,1796,1799],{"href":1797,"rel":1798},"https:\u002F\u002Fneo4j.com\u002Fproduct\u002Fauradb\u002F",[67],"Neo4J AuraDB",".\nI'm using the proposed dataset as a base for the sake of the experiment.\nA graph containing ",[77,1802,1803],{},"Articles",", ",[77,1806,1807],{},"Authors",[77,1809,1810],{},"Topics"," nodes with edges building the relation between them: ",[77,1813,1814],{},"published by",[77,1816,1817],{},"in topic",".\nIt is representative of a research AI assistant, with, for example, a database derived from ",[63,1820,1823],{"href":1821,"rel":1822},"https:\u002F\u002Farxiv.org\u002F",[67],"Arxiv",[50,1825,1826],{},[156,1827],{"alt":1828,"src":1829,"title":1830},"Neo4J Logo and Name","\u002Fposts\u002Fagents_mcp_rag_local_foss\u002Fneo4j-ar21.svg","Neo4J is one of the many options to build a Graph Database.",[50,1832,1833],{},"First, load the Neo4J 
graph:",[181,1835,1837],{"className":1125,"code":1836,"language":1127,"meta":186,"style":186},"from langchain_community.graphs import Neo4jGraph\n\ngraph = Neo4jGraph(\n    url=os.environ[\"NEO4J_URI\"],\n    username=os.environ[\"NEO4J_USERNAME\"],\n    password=os.environ[\"NEO4J_PASSWORD\"],\n)\n",[54,1838,1839,1844,1848,1853,1858,1863,1868],{"__ignoreMap":186},[190,1840,1841],{"class":192,"line":193},[190,1842,1843],{},"from langchain_community.graphs import Neo4jGraph\n",[190,1845,1846],{"class":192,"line":200},[190,1847,424],{"emptyLinePlaceholder":423},[190,1849,1850],{"class":192,"line":228},[190,1851,1852],{},"graph = Neo4jGraph(\n",[190,1854,1855],{"class":192,"line":253},[190,1856,1857],{},"    url=os.environ[\"NEO4J_URI\"],\n",[190,1859,1860],{"class":192,"line":276},[190,1861,1862],{},"    username=os.environ[\"NEO4J_USERNAME\"],\n",[190,1864,1865],{"class":192,"line":299},[190,1866,1867],{},"    password=os.environ[\"NEO4J_PASSWORD\"],\n",[190,1869,1870],{"class":192,"line":333},[190,1871,1211],{},[50,1873,1874,1875,1878],{},"In the case of a graph database, langchain provides a ",[54,1876,1877],{},"GraphCypherQAChain"," that allows us to query our graph database using natural language.\nLike in the case of the Data Analytics Assistant, the queries are handled by a dedicated agent, here from langgraph, with its own set of tools and instructions.",[181,1880,1882],{"className":1125,"code":1881,"language":1127,"meta":186,"style":186},"cypher_chain = GraphCypherQAChain.from_llm(\n    cypher_llm=ChatOllama(model = \"a_local_model\", temperature=0.),\n    qa_llm=ChatOllama(model = \"a_local_model\", temperature=0.),\n    graph=graph,\n    verbose=True,\n    allow_dangerous_requests=True, # should add control in real world\n)\n\ndef graph_retriever(query: str) -> str:\n    return cypher_chain.invoke({\"query\": query})\n\ngraph_retriever_tool = Tool(\n    name=\"graph_retriever_tool\",\n    func=graph_retriever,\n    description=\"\"\"Retrieves detailed 
information about \n    articles, authors and topics from graph database.\n    \"\"\"\n)\n",[54,1883,1884,1889,1894,1899,1904,1909,1914,1918,1922,1927,1932,1936,1941,1946,1951,1956,1961,1965],{"__ignoreMap":186},[190,1885,1886],{"class":192,"line":193},[190,1887,1888],{},"cypher_chain = GraphCypherQAChain.from_llm(\n",[190,1890,1891],{"class":192,"line":200},[190,1892,1893],{},"    cypher_llm=ChatOllama(model = \"a_local_model\", temperature=0.),\n",[190,1895,1896],{"class":192,"line":228},[190,1897,1898],{},"    qa_llm=ChatOllama(model = \"a_local_model\", temperature=0.),\n",[190,1900,1901],{"class":192,"line":253},[190,1902,1903],{},"    graph=graph,\n",[190,1905,1906],{"class":192,"line":276},[190,1907,1908],{},"    verbose=True,\n",[190,1910,1911],{"class":192,"line":299},[190,1912,1913],{},"    allow_dangerous_requests=True, # should add control in real world\n",[190,1915,1916],{"class":192,"line":333},[190,1917,1211],{},[190,1919,1920],{"class":192,"line":433},[190,1921,424],{"emptyLinePlaceholder":423},[190,1923,1924],{"class":192,"line":444},[190,1925,1926],{},"def graph_retriever(query: str) -> str:\n",[190,1928,1929],{"class":192,"line":452},[190,1930,1931],{},"    return cypher_chain.invoke({\"query\": query})\n",[190,1933,1934],{"class":192,"line":463},[190,1935,424],{"emptyLinePlaceholder":423},[190,1937,1938],{"class":192,"line":479},[190,1939,1940],{},"graph_retriever_tool = Tool(\n",[190,1942,1943],{"class":192,"line":490},[190,1944,1945],{},"    name=\"graph_retriever_tool\",\n",[190,1947,1948],{"class":192,"line":498},[190,1949,1950],{},"    func=graph_retriever,\n",[190,1952,1953],{"class":192,"line":1311},[190,1954,1955],{},"    description=\"\"\"Retrieves detailed information about \n",[190,1957,1958],{"class":192,"line":1316},[190,1959,1960],{},"    articles, authors and topics from graph 
database.\n",[190,1962,1963],{"class":192,"line":1321},[190,1964,1279],{},[190,1966,1967],{"class":192,"line":1326},[190,1968,1211],{},[50,1970,1971,1972,1975],{},"I decided to bind this tool to a dedicated agent and build a multi-agent system mostly for experimentation purposes.\nBut the ",[54,1973,1974],{},"graph_retriever_tool"," can be used as a standalone tool for the manager agent, or even exposed through MCP as I did\nin the case of the data analytics.",[50,1977,1978],{},"I performed the tests suggested in the recipe. They are requests forcing the system to build complex Cypher queries to traverse the graph,\nsuch as",[1980,1981,1982],"blockquote",{},[50,1983,1984],{},"Are there any pair of researchers who have published more than three articles together?",[50,1986,1987],{},"and found the right answers ! The system was able to generate a complex query and produce a coherent final response.",[87,1989,1991],{"id":1990},"gradio-for-chat-interaction","Gradio for chat interaction",[50,1993,1994,1995,2000],{},"The only missing piece to the puzzle is a way to interact with the system. That's when ",[63,1996,1999],{"href":1997,"rel":1998},"https:\u002F\u002Fwww.gradio.app\u002F",[67],"Gradio"," comes into play.",[1100,2002],{":show-thumbnail":473,"platform":1102,"repo":2003},"gradio-app\u002Fgradio",[50,2005,2006,2007,2010],{},"It is an open-source Python package that allows you to quickly build a demo or web application for AI models. 
I used the built-in\n",[54,2008,2009],{},"ChatInterface"," to create a simple chat webpage hosted locally to interact with the agent.",[45,2012,848],{"id":847},[50,2014,2015,2016,2021],{},"You can find the complete source code for this example on my ",[63,2017,2020],{"href":2018,"rel":2019},"https:\u002F\u002Fgitlab.com\u002FColinMietka\u002F",[67],"Gitlab",".\nKeep in mind that everything I presented here is evolving rapidly, is subject to change, and certainly can be improved !\nIf you have any questions or suggestions, feel free to reach out !",[1100,2023],{":show-thumbnail":473,"platform":888,"repo":2024},"ColinMietka\u002Flocal-assistant",[863,2026,2027],{},"html .light .shiki span {color: var(--shiki-light);background: var(--shiki-light-bg);font-style: var(--shiki-light-font-style);font-weight: var(--shiki-light-font-weight);text-decoration: var(--shiki-light-text-decoration);}html.light .shiki span {color: var(--shiki-light);background: var(--shiki-light-bg);font-style: var(--shiki-light-font-style);font-weight: var(--shiki-light-font-weight);text-decoration: var(--shiki-light-text-decoration);}html .default .shiki span {color: var(--shiki-default);background: var(--shiki-default-bg);font-style: var(--shiki-default-font-style);font-weight: var(--shiki-default-font-weight);text-decoration: var(--shiki-default-text-decoration);}html .shiki span {color: var(--shiki-default);background: var(--shiki-default-bg);font-style: var(--shiki-default-font-style);font-weight: var(--shiki-default-font-weight);text-decoration: var(--shiki-default-text-decoration);}html .dark .shiki span {color: var(--shiki-dark);background: var(--shiki-dark-bg);font-style: var(--shiki-dark-font-style);font-weight: var(--shiki-dark-font-weight);text-decoration: var(--shiki-dark-text-decoration);}html.dark .shiki span {color: var(--shiki-dark);background: var(--shiki-dark-bg);font-style: var(--shiki-dark-font-style);font-weight: var(--shiki-dark-font-weight);text-decoration: 
var(--shiki-dark-text-decoration);}",{"title":186,"searchDepth":200,"depth":200,"links":2029},[2030,2038,2042,2049],{"id":901,"depth":200,"text":902,"children":2031},[2032,2033,2034,2035,2036,2037],{"id":908,"depth":228,"text":909},{"id":929,"depth":228,"text":930},{"id":962,"depth":228,"text":963},{"id":1006,"depth":228,"text":1007},{"id":1033,"depth":228,"text":1034},{"id":1050,"depth":228,"text":1051},{"id":1063,"depth":200,"text":1064,"children":2039},[2040,2041],{"id":1067,"depth":228,"text":1068},{"id":1083,"depth":228,"text":1084},{"id":1090,"depth":200,"text":1091,"children":2043},[2044,2045,2046,2047,2048],{"id":1094,"depth":228,"text":1095},{"id":1227,"depth":228,"text":1228},{"id":1407,"depth":228,"text":1408},{"id":1773,"depth":228,"text":1774},{"id":1990,"depth":228,"text":1991},{"id":847,"depth":200,"text":848},"2025-09-15","Agentic AI and MCP are the new thing in 2025. I figured it's time to try them out and share the results. Witness the rise of my AI Agent, with Web Search, Data Analysis and Knowledge Graph enhanced RAG.","\u002Fposts\u002Fagents_mcp_rag_local_foss\u002Ffeatured.svg",{},{"title":10,"description":2051},[2056,2057,2058,2059,2060,2061,2062],"AI","LLM","RAG","MCP","Agent","Knowledge Graph","Embeddings","V24P4xSX4jv2UAscYHTYwuucI5zI9MCrMgrPRaPcrrQ",{"id":2065,"title":34,"body":2066,"date":2378,"description":2379,"extension":882,"image":2380,"meta":2381,"navigation":423,"path":35,"readingTime":885,"seo":2382,"stem":36,"tags":2383,"__hash__":2387},"content\u002Fposts\u002Fself_host_your_ai_assistant.md",{"type":42,"value":2067,"toc":2355},[2068,2072,2076,2079,2083,2086,2090,2093,2097,2100,2129,2133,2139,2143,2149,2153,2156,2160,2167,2171,2174,2193,2196,2210,2217,2220,2223,2227,2231,2237,2244,2247,2267,2270,2273,2277,2289,2292,2295,2299,2303,2306,2313,2320,2323,2330,2334,2337,2344,2347,2349,2352],[45,2069,2071],{"id":2070},"the-rise-of-ai-assistants","The rise of AI assistants.",[87,2073,2075],{"id":2074},"what-are-they","What are they 
?",[50,2077,2078],{},"AI assistants have become an integral part of our daily lives, offering a wide range of functionalities from answering questions to automating tasks. These assistants are powered by advanced machine learning models that can process vast amounts of data and provide responses to user queries. AI assistants typically operate by using a combination of natural language processing (NLP) techniques, machine learning algorithms, and large datasets. They are trained on diverse information sources to understand and generate human-like responses. When a user interacts with an AI assistant, the system processes their input, analyzes it for context and intent, and then generates a relevant response based on its training.",[87,2080,2082],{"id":2081},"the-open-source-claim","The open source claim",[50,2084,2085],{},"Most of the AI assistants on the market are proprietary software owned by giants (or by rapidly growing ones) like OpenAI, Meta or Anthropic. Even when companies advertise their models as open source, the reality is that the models they provide and the data used to train them are not open to the public or free to use. There is also a growing concern about the copyright issues surrounding the training phase of these models. The large datasets gathered from all over the internet often contain all kinds of licensed content and do not respect the rights of content creators. This lack of transparency and control over the data used in training can lead to ethical concerns and potential misuse of the technology.",[87,2087,2089],{"id":2088},"the-cost-of-using-ai-assistants","The cost of using AI assistants.",[50,2091,2092],{},"Even though most cloud-based services offer free tiers of usage, the rapid growth of the sector has led to a rise in costs and limited access to premium features, kept behind paywalls. This can be particularly problematic for small businesses or individuals who may not have the budget to afford them. 
The paywall model can also create a digital divide, where only those with the financial means can access advanced AI capabilities.",[87,2094,2096],{"id":2095},"what-about-self-hosting","What about self-hosting ?",[50,2098,2099],{},"Self-hosting an AI assistant seems like a perfect answer to the above concerns. You keep control of your data, and if you choose one of the many 'open source' models, you can basically run it for free.\nEven if you choose to run AI services on your local machine, they still require significant computational resources. In theory, this means that you need a powerful computer with a good GPU to handle the processing demands. But you can also run models on CPU only, albeit at a slower rate. All of this can be achieved with a few different tools:",[102,2101,2102,2110,2121],{},[105,2103,2104,2109],{},[63,2105,2108],{"href":2106,"rel":2107},"https:\u002F\u002Follama.com\u002F",[67],"Ollama"," to run the models",[105,2111,2112,2116,2117,2120],{},[63,2113,2115],{"href":1422,"rel":2114},[67],"Page Assist"," (or ",[63,2118,1430],{"href":1428,"rel":2119},[67],") to interact with the models from your browser",[105,2122,2123,2128],{},[63,2124,2127],{"href":2125,"rel":2126},"https:\u002F\u002Fwww.continue.dev\u002F",[67],"Continue"," for IDE integration",[45,2130,2132],{"id":2131},"install-and-run-llms-with-ollama","Install and run LLMs with Ollama",[50,2134,2135],{},[156,2136],{"alt":949,"src":2137,"title":2138},"\u002Fposts\u002Fself_host_your_ai_assistant\u002Follama_name.svg","[Ollama](https:\u002F\u002Follama.com\u002F) is a tool to run LLMs locally.",[87,2140,2142],{"id":2141},"what-is-ollama","What is Ollama ?",[50,2144,2145,2148],{},[63,2146,2108],{"href":2106,"rel":2147},[67]," is a tool designed to run large language models (LLMs) locally on your computer. It allows you to access a variety of pre-trained models, ranging from versatile general-purpose models to specialized ones for specific domains or tasks. 
Some of the supported models include LLaMA-2, CodeLLaMA, Falcon, Mistral, WizardCoder, and more. Ollama simplifies the process of downloading and managing these models (its usage resembles Docker's), offering a user-friendly experience for those who want to use advanced language models directly on their system.",[87,2150,2152],{"id":2151},"prerequisites","Prerequisites",[50,2154,2155],{},"As you might know from previous posts, I'm running an Arch Linux system on a somewhat powerful PC with a dedicated AMD GPU with enough VRAM to run at least the basic models from Ollama. You should also be aware that not all AMD GPUs are compatible with the ROCm framework that allows models to run on GPU. For example, in my case I needed to install specific packages and customize the configuration of Ollama to be able to make models run on GPU. Depending on your personal hardware and system, the following guidelines may differ, and you should always have a look at the documentation.",[87,2157,2159],{"id":2158},"install","Install",[50,2161,2162,2163,2166],{},"The documentation explains how to install Ollama on your system. I decided to go directly for the official package released by my distribution and activate the systemd service to start Ollama automatically at boot time. You can probably do the same on most distributions.\nI also went for the ",[82,2164,2165],{},"rocm"," version that allows the use of models on an AMD GPU. Again, depending on your system you may need to follow a different installation process.",[87,2168,2170],{"id":2169},"download-and-run-models","Download and run models",[50,2172,2173],{},"After installing Ollama, you can download and run models. The documentation provides a list of available models that you can choose from. You can also create your own models or customize their properties. 
For example, to download Llama 3.2 you can run the following command:",[181,2175,2179],{"className":2176,"code":2177,"language":2178,"meta":186,"style":186},"language-bash shiki shiki-themes material-theme-lighter material-theme material-theme-palenight","ollama pull llama3.2\n","bash",[54,2180,2181],{"__ignoreMap":186},[190,2182,2183,2187,2190],{"class":192,"line":193},[190,2184,2186],{"class":2185},"sBMFI","ollama",[190,2188,2189],{"class":219}," pull",[190,2191,2192],{"class":219}," llama3.2\n",[50,2194,2195],{},"It will download the model from the official repository and store it locally. You can then run the model using the following command:",[181,2197,2199],{"className":2176,"code":2198,"language":2178,"meta":186,"style":186},"ollama run llama3.2\n",[54,2200,2201],{"__ignoreMap":186},[190,2202,2203,2205,2208],{"class":192,"line":193},[190,2204,2186],{"class":2185},[190,2206,2207],{"class":219}," run",[190,2209,2192],{"class":219},[50,2211,2212],{},[156,2213],{"alt":2214,"src":2215,"title":2216},"Ollama prompt in a terminal","\u002Fposts\u002Fself_host_your_ai_assistant\u002Fprompt.png","The prompt you see in a terminal when running an Ollama model.",[50,2218,2219],{},"You will be presented with a prompt in which you can input your query. To close the session, hit Ctrl + d. Ollama does not access the internet and all the chat history you have is stored locally. 
It has many features: you can, for example, customize the way each model answers you or create specific prompts, but I won't cover them here; I'll let you scroll through the documentation.",[50,2221,2222],{},"Now, the basic TUI interface can be enough for some users, but in most cases you probably want your queries to interact with the web (and expose citations) for certain tasks, and you may also prefer a nicer interface.",[45,2224,2226],{"id":2225},"add-a-gui-interface-to-interact-with-your-models-with-page-assist","Add a GUI interface to interact with your models with Page Assist",[87,2228,2230],{"id":2229},"page-assist-web-browser-extension","Page Assist web browser extension",[50,2232,2233,2236],{},[63,2234,2115],{"href":1422,"rel":2235},[67]," is an open source web browser extension available for Chromium- and Firefox-based browsers. It allows you to interact with your local models directly from the web pages you are visiting, providing real-time responses and citations. You can use a full-tab interface or a sidebar, and keyboard shortcuts let you access it quickly.",[50,2238,2239],{},[156,2240],{"alt":2241,"src":2242,"title":2243},"Page assist extension on Firefox","\u002Fposts\u002Fself_host_your_ai_assistant\u002Fpage_assist.png","The prompt you see in the Page Assist extension.",[50,2245,2246],{},"In the settings page, you can set up the connection with the local Ollama server and interact with the different features of the extension:",[102,2248,2249,2255,2261],{},[105,2250,2251,2254],{},[82,2252,2253],{},"Model Management",": You can easily download, update, or remove models.",[105,2256,2257,2260],{},[82,2258,2259],{},"Custom Prompts",": Define custom prompts to tailor the behavior of your AI assistant.",[105,2262,2263,2266],{},[82,2264,2265],{},"Retrieval-Augmented Generation (RAG)",": Add knowledge documents to enhance the responses with relevant information.",[50,2268,2269],{},"The sidebar view is particularly useful when interacting with a 
specific web page. This way you can easily ask your local assistant to summarize the main article on the page, or translate it to another language.",[50,2271,2272],{},"Enabling the web search feature can be very handy when trying to get more context on a topic, or when you need to find specific information.",[87,2274,2276],{"id":2275},"if-you-need-more-there-is-openwebui","If you need more, there is OpenWebUI",[50,2278,2279,2280,2283,2284,152],{},"There are more powerful options than the simple Page Assist extension. One of the most popular is ",[63,2281,1430],{"href":1428,"rel":2282},[67],". It offers the same features as Page Assist, plus more advanced ones, like the ability to use proprietary models like ChatGPT or options for ",[63,2285,2288],{"href":2286,"rel":2287},"https:\u002F\u002Fdocs.openwebui.com\u002Ftutorials\u002Fimages",[67],"text-to-image generation",[50,2290,2291],{},"OpenWebUI is also an open-source project; the service can be deployed as a Docker container, and the web interface is then accessible on your local machine or exposed to the internet if you want to.",[50,2293,2294],{},"I have not tried it yet, and it is definitely on my list of things to explore.",[45,2296,2298],{"id":2297},"what-about-a-coding-assistant-in-your-favorite-ide","What about a coding assistant in your favorite IDE?",[87,2300,2302],{"id":2301},"continue-for-vscode-and-jetbrains-ides-integration","Continue for VSCode and JetBrains IDEs integration",[50,2304,2305],{},"Obviously, as a tech enthusiast, one of the main benefits of this kind of AI assistant is coding assistance. Let's see how we can integrate it into our favorite IDEs.",[50,2307,2308,2309,2312],{},"After a quick web search, I found ",[63,2310,2127],{"href":2125,"rel":2311},[67],", an IDE plugin available for VSCode and JetBrains IDEs (like PyCharm, IntelliJ IDEA, WebStorm, etc.). 
It offers the same kind of features as the integrated options these IDEs offer, but Continue is able to connect to your local Ollama server and use your local models either to generate code or to provide suggestions.",[50,2314,2315,2316,2319],{},"You can associate your Ollama models through the ",[54,2317,2318],{},".continue\u002Fconfig.json"," file settings, and these configurations will be the same in all IDEs you have installed Continue on.",[50,2321,2322],{},"This is a great way to keep your coding assistance local, secure, and under your control. As I have never used a proprietary coding assistant like GitHub Copilot, I can't tell whether the performance of a local model like the Qwen2.5-Coder I'm using right now is better or worse. But I can definitely say that its suggestions are often useful and relevant, even if I could live without them. The code generation prompt is not a feature I use much, but the code completion feature has already saved me from some annoying bug fixes just by checking the coherence of the syntax.",[50,2324,2325],{},[156,2326],{"alt":2327,"src":2328,"title":2329},"Logo of the Continue AI assistant extension","\u002Fposts\u002Fself_host_your_ai_assistant\u002Fcontinue.png","Logo of the [Continue](https:\u002F\u002Fwww.continue.dev\u002F) AI assistant extension",[87,2331,2333],{"id":2332},"disclaimer","Disclaimer",[50,2335,2336],{},"And here is the last point I wanted to make: the Continue AI assistant completes not only code but also docstrings, headers, comments, commit messages, and even markdown articles like the one I am writing now...",[50,2338,2339,2340,2343],{},"So you see me coming: parts of what you read before have been written by AI. Not all of it, I assure you, and the experience has not been so great for me. For general-context sentences, the auto-completion has been at most ",[77,2341,2342],{},"inspiring",", but what it produced was often not usable as is. But in more technical parts, it can become a real pain to use. 
Sometimes, the AI would suggest things that were not relevant or didn't match the context at all. Then it became just a distraction, polluting what I was trying to write.",[50,2345,2346],{},"So, while I am excited about the potential of AI in software development, I am still cautious about relying too heavily on it. It's a tool that can be very helpful when used correctly, but it's important to remember that it's not perfect and should be used with care.",[45,2348,848],{"id":847},[50,2350,2351],{},"Reflecting on my experience with AI tools like Page Assist and Ollama, I approach their use with cautious optimism. These tools can effectively enhance productivity when used wisely. The ability to self-host these models provides crucial benefits, including local control, enhanced privacy, and cost-efficiency, which I value highly. While they can be helpful, I remain mindful of their limitations and ensure they serve as aids rather than replacements for human decision-making.",[863,2353,2354],{},"html pre.shiki code .sBMFI, html code.shiki .sBMFI{--shiki-light:#E2931D;--shiki-default:#FFCB6B;--shiki-dark:#FFCB6B}html pre.shiki code .sfazB, html code.shiki .sfazB{--shiki-light:#91B859;--shiki-default:#C3E88D;--shiki-dark:#C3E88D}html .light .shiki span {color: var(--shiki-light);background: var(--shiki-light-bg);font-style: var(--shiki-light-font-style);font-weight: var(--shiki-light-font-weight);text-decoration: var(--shiki-light-text-decoration);}html.light .shiki span {color: var(--shiki-light);background: var(--shiki-light-bg);font-style: var(--shiki-light-font-style);font-weight: var(--shiki-light-font-weight);text-decoration: var(--shiki-light-text-decoration);}html .default .shiki span {color: var(--shiki-default);background: var(--shiki-default-bg);font-style: var(--shiki-default-font-style);font-weight: var(--shiki-default-font-weight);text-decoration: var(--shiki-default-text-decoration);}html .shiki span {color: var(--shiki-default);background: 
var(--shiki-default-bg);font-style: var(--shiki-default-font-style);font-weight: var(--shiki-default-font-weight);text-decoration: var(--shiki-default-text-decoration);}html .dark .shiki span {color: var(--shiki-dark);background: var(--shiki-dark-bg);font-style: var(--shiki-dark-font-style);font-weight: var(--shiki-dark-font-weight);text-decoration: var(--shiki-dark-text-decoration);}html.dark .shiki span {color: var(--shiki-dark);background: var(--shiki-dark-bg);font-style: var(--shiki-dark-font-style);font-weight: var(--shiki-dark-font-weight);text-decoration: var(--shiki-dark-text-decoration);}",{"title":186,"searchDepth":200,"depth":200,"links":2356},[2357,2363,2369,2373,2377],{"id":2070,"depth":200,"text":2071,"children":2358},[2359,2360,2361,2362],{"id":2074,"depth":228,"text":2075},{"id":2081,"depth":228,"text":2082},{"id":2088,"depth":228,"text":2089},{"id":2095,"depth":228,"text":2096},{"id":2131,"depth":200,"text":2132,"children":2364},[2365,2366,2367,2368],{"id":2141,"depth":228,"text":2142},{"id":2151,"depth":228,"text":2152},{"id":2158,"depth":228,"text":2159},{"id":2169,"depth":228,"text":2170},{"id":2225,"depth":200,"text":2226,"children":2370},[2371,2372],{"id":2229,"depth":228,"text":2230},{"id":2275,"depth":228,"text":2276},{"id":2297,"depth":200,"text":2298,"children":2374},[2375,2376],{"id":2301,"depth":228,"text":2302},{"id":2332,"depth":228,"text":2333},{"id":847,"depth":200,"text":848},"2025-04-02","AI assistants are now commonly used for various tasks, powered by advanced machine learning models. 
While cloud-based services are the go-to for most people, concerns over data privacy have led me to explore self-hosting as a cost-effective alternative.","\u002Fposts\u002Fself_host_your_ai_assistant\u002Ffeatured.svg",{},{"title":34,"description":2379},[2056,2057,2058,2384,2385,2386],"Homelab","Privacy","Code","46vfc0qR_pNJRyEkHZMzKJY-eV2tTVYaS9dMkr2iMS8",{"id":2389,"title":14,"body":2390,"date":2633,"description":2634,"extension":882,"image":2635,"meta":2636,"navigation":423,"path":15,"readingTime":885,"seo":2637,"stem":16,"tags":2638,"__hash__":2644},"content\u002Fposts\u002Fdegoogle_your_phone.md",{"type":42,"value":2391,"toc":2616},[2392,2396,2400,2403,2410,2414,2439,2443,2451,2464,2471,2475,2479,2488,2492,2501,2504,2508,2517,2535,2538,2564,2580,2584,2591,2595,2598,2602,2609,2613],[45,2393,2395],{"id":2394},"should-you-care-about-the-system-on-your-phone","Should you care about the system on your phone ?",[87,2397,2399],{"id":2398},"google-or-apple-pick-your-poison","Google or Apple, pick your poison.",[50,2401,2402],{},"It's been at least ten years now that mobile phones have been completely part of our daily lives. We have them within reach at all times. We use them as phones for sure, but also to access the web, organise our lives, share on social media, ...\nAnd for the last ten years at least, the market has been dominated by two giants: Google and Apple. They developed the base systems on which our devices run, namely Android and iOS. And of course,\nthey now use this dominant position to slowly shape our day-to-day devices into some sort of surveillance systems. They extract and share our personal data with third parties, public or private, for money.\nIn the name of convenience, they predict and shape our behaviors.\nBut not all is lost: it turns out that the base Android system is open-source. 
It's called Android Open Source Project (AOSP), and this opens the door to other possible operating systems for our phones.",[50,2404,2405],{},[156,2406],{"alt":2407,"src":2408,"title":2409},"LineageOS Logo","\u002Fposts\u002Fdegoogle_your_phone\u002FLineageOS.svg","LineageOS is one of the most popular free Android ROMs.",[87,2411,2413],{"id":2412},"what-are-custom-roms","What are Custom ROMs ?",[50,2415,2416,2417,2422,2423,1804,2428,619,2433,2438],{},"An Android ROM (Read-Only Memory) is an alternative version of Android. It can be built on top of the open source base project. There are a lot of different ROMs available, and you can find a list,\nprobably non-exhaustive, on ",[63,2418,2421],{"href":2419,"rel":2420},"https:\u002F\u002Fen.wikipedia.org\u002Fwiki\u002FList_of_custom_Android_distributions",[67],"Wikipedia",". Anyone can build their own system and customize it as they need.\nThen, some of the developers decided to release their version and make it available for others. The most famous are ",[63,2424,2427],{"href":2425,"rel":2426},"https:\u002F\u002Flineageos.org\u002F",[67],"LineageOS",[63,2429,2432],{"href":2430,"rel":2431},"https:\u002F\u002Fgrapheneos.org\u002F",[67],"GrapheneOS",[63,2434,2437],{"href":2435,"rel":2436},"https:\u002F\u002Fe.foundation\u002F",[67],"e\u002FOS\u002F",".\nOf course there are many more, but when you first search the internet to learn more about them, these are the three most discussed options.",[87,2440,2442],{"id":2441},"eos-an-open-source-and-privacy-respecting-ecosystem","e\u002FOS\u002F, an open-source and privacy-respecting ecosystem.",[50,2444,2445,2446,152],{},"I had a closer look at e\u002FOS\u002F. They propose not only the base system but also a complete ecosystem of privacy-respecting applications that completely replaces the Google suite that you have on your phone.\nIt is based on LineageOS, which is already a pretty complete system, and they add applications on top of it. 
You can find all the details you need on their ",[63,2447,2450],{"href":2448,"rel":2449},"https:\u002F\u002Fe.foundation\u002Fe-os\u002F",[67],"e\u002FOS\u002F page",[50,2452,2453,2454,2458,2459,152],{},"One of the main downsides of custom ROMs, which we will discuss in a moment, is device compatibility. It looks like e\u002FOS\u002F has something for a lot of them, whether officially supported by the foundation\nor just by the community. Of course, everything they do is open-source, and if you can\u002Fwant to contribute, you can find everything on their ",[63,2455,888],{"href":2456,"rel":2457},"https:\u002F\u002Fgitlab.e.foundation\u002Fe",[67],".\nYou can also learn more about their mission and goals in their ",[63,2460,2463],{"href":2461,"rel":2462},"https:\u002F\u002Fe.foundation\u002Fabout-e\u002F",[67],"Manifesto",[50,2465,2466],{},[156,2467],{"alt":2468,"src":2469,"title":2470},"A screenshot from e\u002FOS\u002F website","\u002Fposts\u002Fdegoogle_your_phone\u002Feos_website.png","Screenshot from [e\u002FOS\u002F](https:\u002F\u002Fe.foundation\u002F) website.",[45,2472,2474],{"id":2473},"can-i-switch-my-phone-to-eos","Can I switch my phone to e\u002FOS\u002F ?",[87,2476,2478],{"id":2477},"supported-devices","Supported devices",[50,2480,2481,2482,2487],{},"Now that I have your attention with all the good promises of a system like e\u002FOS\u002F, it's time to come back to reality. Installing custom software on your current Android phone is not that easy.\nIn general, the supported devices are well documented, and you can get instructions in the documentation.\nAt the time I write these lines, e\u002FOS\u002F supports almost 200 devices, but the list of officially supported ones (by the foundation) is much shorter.\nFor a complete list of supported devices, go check the ",[63,2483,2486],{"href":2484,"rel":2485},"https:\u002F\u002Fdoc.e.foundation\u002Fdevices",[67],"device list",".\nIn short, the support won't be the same for all devices. 
For the majority of them, the maintenance is done by community members.\nThis means installing and updating may not be easy, and some functionalities may be missing, causing some apps to be unusable (typically banking apps).\nFor the officially supported devices though, you get a simple install, over-the-air (OTA) updates and all apps working, exactly as you would expect from a stock Google Android device.\nSo when trying to see if you could get a shot at e\u002FOS\u002F or any other ROM, definitely take a look at these device lists !",[87,2489,2491],{"id":2490},"installation-method","Installation method",[50,2493,2494,2495,2500],{},"I'll present the two main methods of installation for e\u002FOS\u002F, as I tried both of them on two different devices. Of course, these methods vary for other systems, so go check out the dedicated documentation.\nRegarding e\u002FOS\u002F, you basically have two methods: the manual one with the command line (different for each device), and their ",[63,2496,2499],{"href":2497,"rel":2498},"https:\u002F\u002Fdoc.e.foundation\u002Feasy-installer",[67],"easy-installer"," application.\nThe easy-installer supports only a few devices for now, but if you're not comfortable with the command line, it's a life-saver !\nIf you're familiar with the command line and not too afraid of breaking your phone, you may try the manual installation.\nIt certainly gives you the feeling you've learned something when you reboot the phone and the e\u002FOS\u002F welcome screen appears.",[50,2502,2503],{},"There is a third option, which is building your own ROM from source. It's the most complicated option, too complicated for me, but it probably offers a lot of customizations that you may want to test.\nThis is also probably the way to try and port the system to a new device. 
I would advise this method only for actual developers who know what they're doing...",[87,2505,2507],{"id":2506},"my-choices-and-experience-with-eos","My choices and experience with e\u002FOS\u002F",[50,2509,2510,2511,2516],{},"I had no experience with these custom Android ROMs before, and e\u002FOS\u002F is the first one I tested. I compared LineageOS, e\u002FOS\u002F and other options without Android\nlike ",[63,2512,2515],{"href":2513,"rel":2514},"https:\u002F\u002Fpostmarketos.org\u002F",[67],"postmarketOS",", which I will probably try in the future.\nI went for e\u002FOS\u002F just because I found some reviews (that I can't find anymore) that were really great. It also matched better with the devices I had available at the time.\nIn the end, I feel that it just comes down to what OS my device can support. Because, at least for a first try, I didn't want to buy a new, or even refurbished, device.",[50,2518,2519,2520,2523,2524,2528,2529,2534],{},"I tested a first install on an ",[77,2521,2522],{},"old"," Samsung A20e (it was sitting in a drawer) which is only community-supported by e\u002FOS\u002F. I found a good ",[63,2525,958],{"href":2526,"rel":2527},"https:\u002F\u002Fcommunity.e.foundation\u002Ft\u002Fhowto-an-unified-install-guide-project-for-e\u002F36234",[67],"\non the community forum that explains the general method to install e\u002FOS\u002F on a Samsung device. I had to download a ROM created by a community called ",[63,2530,2533],{"href":2531,"rel":2532},"https:\u002F\u002Feurekadevelopment.github.io\u002F",[67],"Eureka"," and followed the instructions to\nunlock the phone's bootloader, install TWRP, the recovery tool that is used to install the actual e\u002FOS\u002F system, and then install e\u002FOS\u002F itself.",[50,2536,2537],{},"Technically, at some point while I was following the installation guide, I felt like I had destroyed my phone. But in the end, I managed to make it reboot into e\u002FOS\u002F. 
What a relief !\nI used it for a couple of weeks, and I could tell it had good potential, but some things were not working at all.\nFirst, almost every application was working, even the most demanding in security. Banking apps were completely ok, but the main issue was with the bootloader unlocking.",[50,2539,2540,2541,2544,2545,2548,2549,1425,2554,2559,2560,2563],{},"Because you need to unlock it to install e\u002FOS\u002F, some applications consider the phone ",[77,2542,2543],{},"insecure",". I discovered later that you can re-lock the bootloader, but not on all devices !\nAfter that, installing applications worked; it was not perfect, because the incognito mode on the e\u002FOS\u002F store called ",[77,2546,2547],{},"App Lounge"," was not working well.\nI still managed to run all I wanted via ",[63,2550,2553],{"href":2551,"rel":2552},"https:\u002F\u002Fauroraoss.com\u002F",[67],"Aurora",[63,2555,2558],{"href":2556,"rel":2557},"https:\u002F\u002Ff-droid.org\u002F",[67],"F-Droid",". As a community version, it was a bit outdated, with only ",[77,2561,2562],{},"Android 12"," if I remember correctly, and older security updates.\nSometimes, I think it may be due to some mistakes in the installation process, but who knows.",[50,2565,2566,2567,2572,2573,2576,2577,152],{},"The experience was good enough for me to make the switch. But this time, I decided to search for a well-supported device and buy a refurbished one.\nI went first to the e\u002FOS\u002F foundation (Murena) ",[63,2568,2571],{"href":2569,"rel":2570},"https:\u002F\u002Fmurena.com\u002Fproducts\u002Fsmartphones\u002F",[67],"store",". They sell and ship smartphones with preinstalled e\u002FOS\u002F.\nI also checked the officially supported devices in the ",[63,2574,2486],{"href":2484,"rel":2575},[67]," and searched for the right smartphone for me. 
Of course, this is just a personal choice, but even though\nI considered a FairPhone, with all the goodwill behind it, I thought the device did not suit my needs (mostly battery life). Ironically, it seems that one of the most supported devices\n(not only for e\u002FOS\u002F but also for other systems) is the Google Pixel 5. What a joke ! So I bought one, and this time I used the easy-installer. It was ",[77,2578,2579],{},"flawless",[45,2581,2583],{"id":2582},"what-to-expect-after-installing","What to expect after installing",[50,2585,2586],{},[156,2587],{"alt":2588,"src":2589,"title":2590},"A screenshot from e\u002FOS\u002F website with home screen and applications","\u002Fposts\u002Fdegoogle_your_phone\u002Feos_apps.png","Screenshot from [e\u002FOS\u002F](https:\u002F\u002Fe.foundation\u002Fe-os\u002F) website, with home screen and applications.",[87,2592,2594],{"id":2593},"its-just-working-as-any-other-android-phone","It's just working as any other Android phone",[50,2596,2597],{},"It's now been a few months using e\u002FOS\u002F daily. The installation process went super well and I have exactly all I needed. It's just working as my previous Android phone, except I don't have the preinstalled applications, either by Google or\nthe manufacturer (think about all the Samsung Apps nobody uses and that cannot be uninstalled). Banking Apps work perfectly, NFC, Bluetooth, Wi-Fi, Health care, everything. All without any Google account, no ads, no trackers, ...\nIt's truly my phone.",[87,2599,2601],{"id":2600},"safetynet-and-bootloader-relocking","SafetyNet and bootloader relocking",[50,2603,2604,2605,2608],{},"As I said before, bootloader relocking was a problem in my first attempt. Without it, some of the apps (Doctolib for health care for example) were just not working.\nWith the right device and the easy-installer, I relocked the bootloader after installation and never had any issue. 
Something that I didn't encounter but can be blocking for you is SafetyNet. Even if I don't understand the details of it,\nit is mostly needed for banking applications, so make sure that you choose a device that supports it. For me, with the official Pixel 5 ",[77,2606,2607],{},"redfin",", all good !",[87,2610,2612],{"id":2611},"try-it-out","Try it out !",[50,2614,2615],{},"You have an old phone in the closet, I know it ! Just don't be afraid to break it, and try e\u002FOS\u002F or anything else you prefer. I'm sure you will be tempted, as I was, to just change the way you live with your phone.",{"title":186,"searchDepth":200,"depth":200,"links":2617},[2618,2623,2628],{"id":2394,"depth":200,"text":2395,"children":2619},[2620,2621,2622],{"id":2398,"depth":228,"text":2399},{"id":2412,"depth":228,"text":2413},{"id":2441,"depth":228,"text":2442},{"id":2473,"depth":200,"text":2474,"children":2624},[2625,2626,2627],{"id":2477,"depth":228,"text":2478},{"id":2490,"depth":228,"text":2491},{"id":2506,"depth":228,"text":2507},{"id":2582,"depth":200,"text":2583,"children":2629},[2630,2631,2632],{"id":2593,"depth":228,"text":2594},{"id":2600,"depth":228,"text":2601},{"id":2611,"depth":228,"text":2612},"2024-10-10","With today's smartphones, you can do anything. And the two main systems reign supreme on the market. But if you value your privacy, another path is possible. It is called e\u002FOS\u002F. It's open source, it's free, and it doesn't spy on you. 
Let's see what it is and how to finally switch your phone to it !","\u002Fposts\u002Fdegoogle_your_phone\u002Ffeatured.svg",{},{"title":14,"description":2634},[2639,2640,2385,2641,2642,2643],"Phone","Android","e\u002FOS\u002F ","Murena","ROM","bzmJ9XMOJ1p-35AbjApiKAEOGtx8x_DFj8e9X6XFzJY",{"id":2646,"title":30,"body":2647,"date":2923,"description":2924,"extension":882,"image":2925,"meta":2926,"navigation":423,"path":31,"readingTime":885,"seo":2927,"stem":32,"tags":2928,"__hash__":2930},"content\u002Fposts\u002Fown_your_data.md",{"type":42,"value":2648,"toc":2901},[2649,2653,2657,2664,2668,2697,2704,2708,2715,2719,2730,2734,2738,2744,2751,2755,2774,2781,2784,2787,2795,2799,2830,2837,2841,2845,2848,2852,2867,2871,2874,2883,2887,2890,2894],[45,2650,2652],{"id":2651},"how-i-started-my-homelab","How I started my homelab",[87,2654,2656],{"id":2655},"what-do-i-do-with-my-old-hardware","What do I do with my old hardware ?",[50,2658,2659,2660,2663],{},"A good question, right ? I bought a new computer before the old one became useless, probably like many people do. Nobody had a use for it except me at the time, so it slept in a closet for almost 2 years until I decided to revive it.\nMy idea was to remove the old macOS system and replace it with a ",[77,2661,2662],{},"new linux user"," friendly distro for the rest of the family. I went for Linux Mint, an Ubuntu-based distro with a good reputation among new users. And as an Ubuntu base, I was going to be able to practice a few skills on server management, and maybe turn the old laptop into a media center. At least, that's what I thought at the start.",[87,2665,2667],{"id":2666},"get-started","Get started",[50,2669,2670,2671,1804,2676,619,2681,2686,2687,2692,2693,2696],{},"As often for me, the journey was as important as the destination. My goal was not only to use the built server in the end, but also to learn how to put it in place and manage it. 
I found a few YouTube channels to get ideas and good practices on how to build a first homelab. I can recommend ",[63,2672,2675],{"href":2673,"rel":2674},"https:\u002F\u002Fwww.youtube.com\u002F@TechnoTim",[67],"TechnoTim",[63,2677,2680],{"href":2678,"rel":2679},"https:\u002F\u002Fwww.youtube.com\u002F@WolfgangsChannel",[67],"Wolfgang's Channel",[63,2682,2685],{"href":2683,"rel":2684},"https:\u002F\u002Fwww.youtube.com\u002F@christianlempa",[67],"Christian Lempa",". The most useful for me has been TechnoTim's, accompanied by his ",[63,2688,2691],{"href":2689,"rel":2690},"https:\u002F\u002Ftechnotim.live\u002F",[67],"documentation website",". You will find there all kinds of tutorials that can help you understand what you need to know about hardware, networking, containers and how to configure them. In fact, after a few days researching the subject, I found myself discovering a complete community of ",[77,2694,2695],{},"Homelabbers",". Everyone has their own use cases and preferences but you learn a lot just by looking at what others do and publish on the web.",[50,2698,2699],{},[156,2700],{"alt":2701,"src":2702,"title":2703},"Docker Logo","\u002Fposts\u002Fown_your_data\u002Fdocker.png","[Docker](https:\u002F\u002Fwww.docker.com\u002F) is a great solution to run services as containers.",[87,2705,2707],{"id":2706},"know-what-you-want-to-do","Know what you want to do",[50,2709,2710,2711,2714],{},"Having a lot of options, and as much advice, is great. But the drawback is that you need to focus on what you need and want to do. You will find on the internet that some people do not stop at a single computer when building their homelab. You can be overwhelmed by information on hardware, proxies, networks, and so on. People are building entire data centers in their garage, at a prohibitive cost obviously. In my case, I just want a few services: storing pictures and files, serving movies to other devices at home, maybe having my own GitLab instance to test things out. 
In short, I decided I was ",[82,2712,2713],{},"not"," going to build a noisy and power-hungry setup just for fun !",[87,2716,2718],{"id":2717},"what-about-cost","What about cost ?",[50,2720,2721,2722,2725,2726,2729],{},"Another aspect to keep in mind. You plan to have a computer running 24\u002F7 at your house. What's the power bill behind it ? Using an old computer, not made for this use case, is probably a bad idea. You can find low power consumption PCs, or ",[77,2723,2724],{},"mini PCs"," maybe, but I find them lacking in power and storage. It's probably why people on the internet build entire networks for their homelab. They couple storage devices, NAS, networking devices like firewalls and VMs using ",[77,2727,2728],{},"hypervisors"," to run the containers. Well, for the moment, I only have the old laptop, so I power it on just when I need it. It will do for the time being but I plan to find a good setup, with enough power and storage to support my use case (which is not too demanding, I guess) and with a low power consumption. The perfect setup probably does not exist, but I'll get as close as possible. But as I said, I don't want to build a data center at home; I'll stick with only one machine.",[45,2731,2733],{"id":2732},"my-current-setup","My current setup",[87,2735,2737],{"id":2736},"the-old-laptop-got-a-second-life","The old laptop got a second life",[50,2739,2740,2741,2743],{},"As I said, I installed Linux Mint on my old MacBook Pro from 2013. It was declining more and more and needed a cleanup. I know Linux Mint is not a perfect ",[77,2742,989],{}," distro but it's not the main use case of a laptop after all. Mint is easy to use and because of its proximity to Ubuntu (on the system side of things) it's a close enough choice to get started. It has 8 GB of RAM and a 250 GB SSD storage drive. Both felt kind of small, as you'll see below. But I have nothing to say about the distro itself. 
I found all the applications I needed and maintenance is much easier than on my Archlinux desktop. The Cinnamon desktop environment is good enough for the whole family's usage, even if, when I use it, I miss my KDE setup a lot ! But in the end, everything feels as if it was just a brand new laptop. Which, for a 10-year-old device, is really good.",[50,2745,2746],{},[156,2747],{"alt":2748,"src":2749,"title":2750},"Nextcloud cloud platform logo","\u002Fposts\u002Fown_your_data\u002Fnextcloud.png","[Nextcloud](https:\u002F\u002Fnextcloud.com\u002F), the open-source, on-premises content collaboration platform",[87,2752,2754],{"id":2753},"core-services-i-wanted-to-have","Core services I wanted to have",[50,2756,2757,2758,2763,2764,2767,2768,2773],{},"The whole homelab experiment started for one particular service, and it is ",[63,2759,2762],{"href":2760,"rel":2761},"https:\u002F\u002Fnextcloud.com\u002F",[67],"Nextcloud",". Basically, it offers the same kind of services as the Google suite: calendar, file storage, parallel editing, picture management, etc... but it's free and open source. I quickly faced challenges putting the necessary containers in place. I'm not too familiar with Docker and just have a basic understanding of how to create containers and deploy them. So I decided to look at something simpler and also at a Docker utility called ",[63,2765,74],{"href":72,"rel":2766},[67]," to help me manage the containers. After spinning up Portainer, I started looking at ",[63,2769,2772],{"href":2770,"rel":2771},"https:\u002F\u002Fjellyfin.org\u002F",[67],"Jellyfin",", a FOSS media solution. 
It was much easier to deal with and it felt good to have my first service running.",[50,2775,2776],{},[156,2777],{"alt":2778,"src":2779,"title":2780},"Homepage Service example showing various services running","\u002Fposts\u002Fown_your_data\u002Fhomepage.png","[Homepage](https:\u002F\u002Fgethomepage.dev\u002Flatest\u002F) Service example showing various services running.",[50,2782,2783],{},"Of course, all of it was still running on the local network, which is great, but why not share the movie library with the rest of the family ? And that's where the really technical part started. I'll give more details about it below but just keep in mind that networking, IP addresses, DNS records and proxy servers are the basis of it all. You need to have at least a basic understanding of networking to have your homelab running safely from home and access services remotely.",[50,2785,2786],{},"After Jellyfin, I came back to Nextcloud and had it working as well. It's a wonderful solution that I use now to synchronize the pictures from family smartphones. It's easy to set up with their dedicated app on Android. I'd very much like to store a complete copy of the photos library but the laptop lacks storage. It will have to wait... I still have to explore the many possibilities with Nextcloud. It has a huge library of apps and plugins to help you do whatever you would want to do. I just tested a few things around synchronizing contacts and calendars between Android and my desktop Thunderbird client.",[50,2788,2789,2790,2794],{},"I also deployed my own ",[63,2791,2020],{"href":2792,"rel":2793},"https:\u002F\u002Fdocs.gitlab.com\u002Fee\u002Finstall\u002Fdocker.html",[67]," instance. I did not have much experience with the use of their runners and pipelines and wanted to test things out. Turns out that you can already build pipelines and use free runners directly with your free account on GitLab.com, at least when you don't use too much compute time. 
But it was still a great experience to try deploying it and have access to the backend settings of such a tool.",[87,2796,2798],{"id":2797},"the-technical-part","The technical part",[50,2800,2801,2802,2807,2808,2813,2814,1804,2819,619,2824,2829],{},"Up until this point, everything had gone smoothly. The main difficulties I faced were opening the ports on my router, setting up the reverse proxy using ",[63,2803,2806],{"href":2804,"rel":2805},"https:\u002F\u002Fdoc.traefik.io\u002Ftraefik\u002F",[67],"Traefik"," and getting SSL certificates from ",[63,2809,2812],{"href":2810,"rel":2811},"https:\u002F\u002Fwww.cloudflare.com",[67],"Cloudflare",". So the networking part was a struggle. I had no issues regarding the server load until I added the GitLab container. Suddenly, the old laptop's 8 GB of RAM were looking too small. How did I notice ? First by the fan noise... And then I added a monitoring stack of containers to keep an eye on the server load from a distance. For the moment I use a combination of ",[63,2815,2818],{"href":2816,"rel":2817},"https:\u002F\u002Fprometheus.io",[67],"Prometheus",[63,2820,2823],{"href":2821,"rel":2822},"https:\u002F\u002Fgrafana.com\u002Foss\u002Floki\u002F?pg=logs&plcmt=options",[67],"Loki",[63,2825,2828],{"href":2826,"rel":2827},"https:\u002F\u002Fgrafana.com\u002F",[67],"Grafana",". 
When combined, you are able to get a lot of monitoring data from your machine, and nice dashboards to analyze what's happening.",[50,2831,2832],{},[156,2833],{"alt":2834,"src":2835,"title":2836},"An example of Grafana Dashboard","\u002Fposts\u002Fown_your_data\u002Fgrafana_dashboard.png","An example of Grafana dashboard [Node Exporter](https:\u002F\u002Fgrafana.com\u002Fgrafana\u002Fdashboards\u002F1860-node-exporter-full\u002F).",[45,2838,2840],{"id":2839},"future-plans","Future plans",[87,2842,2844],{"id":2843},"dedicated-pc-build","Dedicated PC build",[50,2846,2847],{},"I mentioned it earlier but I reached the limits of my laptop with just a few services running on a dozen containers. It's no surprise in reality because it's old hardware, not meant for it, and running on a system also not optimized for this use case. So my next goal is to build a small-case PC with lots of storage and RAM to have at home running 24\u002F7. When I find the perfect setup, I'll probably write a new post to describe it !",[87,2849,2851],{"id":2850},"more-power-more-services","More power, more services ?",[50,2853,2854,2855,2860,2861,2866],{},"Do I need more services ? No. Do I want to explore and test new things ? Probably yes. I don't really know what I want to try next. I saw a lot of discussions about the enrichment of media libraries using the ",[63,2856,2859],{"href":2857,"rel":2858},"https:\u002F\u002Fwiki.servarr.com\u002F",[67],"Servarr"," stack, or a web server using ",[63,2862,2865],{"href":2863,"rel":2864},"https:\u002F\u002Fnginx.org",[67],"Nginx"," to host my own website maybe. Who knows !",[45,2868,2870],{"id":2869},"why-should-you-care-about-owning-your-data","Why should you care about owning your data ?",[87,2872,2385],{"id":2873},"privacy",[50,2875,2876,2877,2882],{},"Yes, why should you even consider having a second computer at home, right ? 
And why bother with services, programs and maintenance when, for a few coins per month, everything can be stored in the cloud, by a giant tech company that saves all your data for you... You see the point coming, even if it's not the money (when you have thousands of pictures, storage price might become a problem), the problem is that you give up your privacy to these platforms. Not all of them are collecting data of course, I imagine a company like ",[63,2878,2881],{"href":2879,"rel":2880},"https:\u002F\u002Fproton.me\u002F",[67],"Proton"," offers their storage and services with privacy features for example but the best way to keep your things yours is just to keep them with you.",[87,2884,2886],{"id":2885},"have-better-control-on-what-you-own","Have better control on what you own",[50,2888,2889],{},"Do you know how many online accounts you have, and where all the passwords are ? Where did you store your pictures over the years, Apple's iCloud or Google Photos maybe ? Some may be on social media now, and it's the only copy you have left. Do you also have a music library ? And what about administrative files and folders you created over the years ? My point is, keeping your data at home will probably help you keep track of what you have and keep it safe.",[87,2891,2893],{"id":2892},"backups","Backups",[50,2895,2896,2897,2900],{},"I did not talk much about backups. Obviously you have to make them, and the homelab can host a copy of your data. But as always, don't put everything in the same place. It is said that a good backup strategy is to have 3 copies of your data, on at least 2 different storage media, with 1 copy kept away from your home. 
The famous ",[77,2898,2899],{},"3-2-1"," rule to save all your data.",{"title":186,"searchDepth":200,"depth":200,"links":2902},[2903,2909,2914,2918],{"id":2651,"depth":200,"text":2652,"children":2904},[2905,2906,2907,2908],{"id":2655,"depth":228,"text":2656},{"id":2666,"depth":228,"text":2667},{"id":2706,"depth":228,"text":2707},{"id":2717,"depth":228,"text":2718},{"id":2732,"depth":200,"text":2733,"children":2910},[2911,2912,2913],{"id":2736,"depth":228,"text":2737},{"id":2753,"depth":228,"text":2754},{"id":2797,"depth":228,"text":2798},{"id":2839,"depth":200,"text":2840,"children":2915},[2916,2917],{"id":2843,"depth":228,"text":2844},{"id":2850,"depth":228,"text":2851},{"id":2869,"depth":200,"text":2870,"children":2919},[2920,2921,2922],{"id":2873,"depth":228,"text":2385},{"id":2885,"depth":228,"text":2886},{"id":2892,"depth":228,"text":2893},"2024-06-24","What can you do with the old computer you have in the closet ? I decided to install a Linux system and use it as a homelab. A background computer that can run a few services like serving media on your network, hosting your website, keeping a copy of your data, and much more !","\u002Fposts\u002Fown_your_data\u002Ffeatured.svg",{},{"title":30,"description":2924},[2929,2384,2385],"Data","IF7lVvjeF3Depe5PY9Orfu9JMCX5jQyDGCKl-_mUj_k",{"id":2932,"title":26,"body":2933,"date":3197,"description":3198,"extension":882,"image":3199,"meta":3200,"navigation":423,"path":27,"readingTime":885,"seo":3201,"stem":28,"tags":3202,"__hash__":3206},"content\u002Fposts\u002Fmy_switch_to_linux.md",{"type":42,"value":2934,"toc":3175},[2935,2939,2943,2946,2950,2953,2957,2960,2964,2968,2971,2975,2978,2982,2997,3001,3005,3020,3027,3031,3058,3068,3076,3080,3095,3104,3108,3112,3121,3128,3134,3138,3147,3154,3157,3160,3164],[45,2936,2938],{"id":2937},"a-bit-of-history","A bit of History",[87,2940,2942],{"id":2941},"early-days-with-windows","Early days with Windows",[50,2944,2945],{},"I grew up with my father's computer. 
At the time obviously, the main operating system out there was Windows by far. I learned to use a computer on Windows 95 and as I got older, used Windows systems until, I think, Windows Vista for personal use. I stopped using Windows on my main machine in early 2013. Was I disappointed with the OS? Probably. But the main reason I decided to leave Windows was the hardware. As a student, I bought two laptops (cheap ones) and the hardware failed on me badly. The choices I made were probably not the best but I didn't have the money to buy a really good computer with great battery life, computing and gaming capabilities, etc...",[87,2947,2949],{"id":2948},"linux-dual-boot","Linux Dual Boot",[50,2951,2952],{},"I remember giving Linux a try using dual boot. A few friends of mine were just toying with the command line on Ubuntu. So I practiced a little and demystified the terminal utility, as one might say. But it never got past a few programs and use cases, it was just for fun.",[87,2954,2956],{"id":2955},"when-apple-took-over","When Apple took over",[50,2958,2959],{},"In early 2013 then, I decided to take advantage of the discount my school granted to buy one of the newly designed MacBook Pros. The promise for me was better hardware and the macOS desktop features. I was impressed by the OS capabilities and design as it was way better (in my humble opinion) than the Windows one. For the hardware, I decided to pay a little extra money to buy the latest SSD devices and have hardware that would last as long as possible. My idea was that if you buy hardware that is already a few years old, it will be outdated quicker. I think it went well, because we are in 2024 and this MacBook Pro is still working! I did everything with it for almost 10 years. All the usual web browsing, some gaming (obviously not the latest games but still), I even used it during my PhD as my main machine to perform computations, and write my manuscript. 
In the end, apart from a small issue with the speakers, it was still working perfectly in 2022, except that Apple didn't ship updates for it anymore...",[45,2961,2963],{"id":2962},"i-felt-a-change-was-needed","I felt a change was needed !",[87,2965,2967],{"id":2966},"apples-update-policy","Apple's update policy",[50,2969,2970],{},"The OS update issue didn't arise in late 2022, it was much earlier. When trying to update, the system would show a warning saying that the hardware was not supported anymore (or at least badly) or even that the OS itself would take too much storage space to be installed on my machine. I was honestly angry at Apple for that. The computer was perfectly fine but as time went by the system would become more and more outdated and unusable. At first, I didn't care but at some point some of the software was not even working. For example, Safari would refuse to display some websites because its version was too old. Of course, you can always change the software you use for ones that are actually working, but it did not feel right to me. I was just pushed away.",[87,2972,2974],{"id":2973},"windows-is-still-a-nogo","Windows is still a nogo",[50,2976,2977],{},"Because I also wanted a new PC to install some recent games, why not come back to Windows? That one is easy, I have the newest Windows 11 version at work, and it's a nightmare. No way I use this if I have a choice.",[87,2979,2981],{"id":2980},"what-about-those-new-m1-imacs","What about those new M1 iMacs ?",[50,2983,2984,2985,2990,2991,2996],{},"I took a look at the new M1 ",[63,2986,2989],{"href":2987,"rel":2988},"https:\u002F\u002Fwww.apple.com\u002Fimac\u002F",[67],"iMacs"," that were just released at the time. They looked like powerful machines, clearly easier to integrate in your home interior than a clunky tower and screen. Unfortunately, they would reach, in time, an end of life decided in an office far away by the giant company making them. 
I was seriously considering it but I also wanted to see what Linux had to offer, 10 years after our last encounter. I found out that it was as powerful as ever, could even run games now thanks to ",[63,2992,2995],{"href":2993,"rel":2994},"https:\u002F\u002Fgithub.com\u002FValveSoftware\u002FProton",[67],"Valve's Proton"," contributions and that there was basically no use case left out now. A friend of mine was even installing Linux Mint on his PC and all that was enough to finally attract me.",[45,2998,3000],{"id":2999},"linux-it-is-then-but-how","Linux it is then, but how ?",[87,3002,3004],{"id":3003},"searching-for-advice","Searching for advice",[50,3006,3007,3008,3013,3014,3019],{},"A good part of the decision was made because of the good-quality content I found about Linux, distributions, open source software, etc... A few YouTube content creators rapidly caught my attention, namely Nick from ",[63,3009,3012],{"href":3010,"rel":3011},"https:\u002F\u002Fwww.youtube.com\u002F@TheLinuxEXP",[67],"The Linux Experiment"," and Jay from ",[63,3015,3018],{"href":3016,"rel":3017},"https:\u002F\u002Fwww.youtube.com\u002F@LearnLinuxTV",[67],"Learn Linux TV",". Both of them, I still watch two years later. You can find on these channels everything you need to know about the new system you are installing. How to install it, what to expect, what will work and what may not work out of the box, you name it. They cover and regularly update a variety of subjects, so it's in my opinion a pretty good start.",[50,3021,3022],{},[156,3023],{"alt":3024,"src":3025,"title":3026},"Distributions Icons","\u002Fposts\u002Fmy_switch_to_linux\u002Fdistros.png","Icon set made by [Walruz](https:\u002F\u002Fwww.reddit.com\u002Fuser\u002Fwalrusz\u002F)",[87,3028,3030],{"id":3029},"distributions","Distributions",[50,3032,3033,3034,3037,3038,3041,3042,3045,3046,3049,3050,3053,3054,3057],{},"Here is the famous question: which distribution will you choose for your system? 
When choosing a distribution, from what I understood then, you gain access to a pool of ",[82,3035,3036],{},"applications"," and a given ",[82,3039,3040],{},"Desktop Environment",". I know it's a simplified view of it but when you're a beginner, that is what it looks like. The number of distributions is overwhelming, but you can still see through their lineage. Roughly, you have ",[77,3043,3044],{},"Debian"," children, followed by their ",[77,3047,3048],{},"Ubuntu"," children (which are many) and you also have ",[77,3051,3052],{},"Arch"," children. The main difference is the way updates are handled. Arch children have a short cycle, they are called ",[77,3055,3056],{},"rolling release",", whereas Debian-like distros have bigger chunks of upgrades. You then have to weigh having up-to-date software against dealing with regular updates that can disrupt your system. In short: stability vs availability.",[50,3059,3060,3061,619,3064,3067],{},"The second main topic when choosing a distribution is the Desktop Environment it ships with. The two main ones are ",[77,3062,3063],{},"Gnome",[77,3065,3066],{},"KDE",". Again, you have pros and cons for both of them, but it seemed clear to me that the way they were developed is quite different. Gnome has its own set of apps and a clearly defined look for the desktop. KDE, on the other side, also has its apps but regarding the look, you have a lot of customizations available. I also read that it is said to be less stable than Gnome but being able to customize anything on your desktop is attractive.",[50,3069,3070,3071,152],{},"One thing to also keep in mind is that depending on the distribution chosen, some software may be difficult to access. 
It is less and less true with the rise of the Flatpak format, but it can still be an issue.\nIf you want to have a look at the distros available, everything is on ",[63,3072,3075],{"href":3073,"rel":3074},"https:\u002F\u002Fdistrowatch.com\u002F",[67],"Distrowatch",[87,3077,3079],{"id":3078},"my-first-choice-and-experiences","My first choice and experiences",[50,3081,3082,3083,3088,3089,3094],{},"I finally bought a computer that I configured myself on ",[63,3084,3087],{"href":3085,"rel":3086},"https:\u002F\u002Fwww.topachat.com\u002Faccueil\u002Findex.php",[67],"Top Achat",". It's a French shop and I mention them here because I had a few discussions with their customer support, and they were perfect. For the distribution, I remembered friends talking about Archlinux as if it were the only distribution worth it in this world, but I thought it was too complicated for me at the start. I wanted a rolling release but I needed to learn how things worked first. And so I came to ",[63,3090,3093],{"href":3091,"rel":3092},"https:\u002F\u002Fmanjaro.org\u002F",[67],"Manjaro",". It's an Arch derivative, with 3 options for the Desktop Environment. I chose KDE for its customization capabilities and I never regretted it! I still use KDE now and I find it to be very stable for my day-to-day use. As an Arch derivative, Manjaro has a short release cycle, but keeps back a few updates to be able to test for stability before making them available to the public. 
I thought it was a good compromise.",[50,3096,3097,3098,3103],{},"Looking back at this choice, I think maybe ",[63,3099,3102],{"href":3100,"rel":3101},"https:\u002F\u002Fendeavouros.com\u002F",[67],"EndeavourOS"," would have been a better starting point: it has the same Arch base but its community has a better reputation than Manjaro's.",[45,3105,3107],{"id":3106},"two-years-later-what-has-changed","Two years later, what has changed.",[87,3109,3111],{"id":3110},"i-gained-experience-and-confidence","I gained experience and confidence",[50,3113,3114,3115,3120],{},"I used Manjaro for a full year. I never had any major issue with the system, except once when, after an update, I got a black screen. I managed to reinstall the Nvidia drivers that I needed using only the command line and fixed the problem myself just by looking for the solution on the internet. It felt so good to be able to do that, even if I thought for a minute that I would have to start over and reinstall everything! I installed Steam pretty easily and could play as many games as I wanted. Valve is making it easy for Linux users, you just need to check if the game you want to buy is well supported by Proton on the ",[63,3116,3119],{"href":3117,"rel":3118},"https:\u002F\u002Fwww.protondb.com\u002F",[67],"Proton DB"," website, and you're good to go in my experience.",[50,3122,3123,3124,3127],{},"I could do everything I needed with my Manjaro system, all the usual browsing, administrative tasks (I don't have a printer, so I can't tell if that would work, but signing PDFs was easy), gaming, coding, etc... I was enjoying the ",[82,3125,3126],{},"KDE Plasma"," desktop, tweaking it to my liking and spending probably too much time customizing things using the command line and config files.",[50,3129,3130,3131,152],{},"I loved learning how my system works and I grew familiar with the command line utilities during this first year. 
And then I felt ready for the next step: installing ",[82,3132,3133],{},"Archlinux",[87,3135,3137],{"id":3136},"changing-the-distro","Changing the distro",[50,3139,3140,3141,3146],{},"Before I continue, I must say that I get the whole \"I use Arch btw\" thing. Because the system you're trying to install is kept at a bare minimum, without any graphical interface to work with at the start, it feels like a challenge. Keeping it well maintained can also be one if you're not careful. But nowadays, with the immense quality of the ",[63,3142,3145],{"href":3143,"rel":3144},"https:\u002F\u002Fwiki.archlinux.org\u002F",[67],"Archwiki",", the installation is accessible and maintaining the system is not that hard.",[50,3148,3149],{},[156,3150],{"alt":3151,"src":3152,"title":3153},"Archlinux Logo","\u002Fposts\u002Fmy_switch_to_linux\u002Farch.png","The official Archlinux logo from [archlinux.org](https:\u002F\u002Farchlinux.org\u002F)",[50,3155,3156],{},"Nevertheless, it is still a challenge when you install it for the first time and you don't know all about what is going on behind the scenes. So I practiced the installation process on VMs, and failed a few times. Forgot to install the bootloader, can't boot. Forgot the wireless network interface, no cable, no internet at reboot. It was difficult but I finally managed to install Arch on two different VMs and then went for it for real. I did a backup of everything (you should too, always) and it all went smoothly! I'm writing these lines from this same Arch install, still working after more than a year.",[50,3158,3159],{},"What I want to say is that it is rewarding. You have control over everything on your computer, you have the choice of every little piece of software you want to install. Nothing has been chosen for you beforehand, or almost nothing, just the vitals.",[87,3161,3163],{"id":3162},"whats-next","What's next ?",[50,3165,3166,3167,3170,3171,3174],{},"I am at home with Arch on my main machine now. 
Everything is in place and I have no trouble in my day-to-day use or even for maintenance. I would still like to try other distros, and for that I erased macOS from my old laptop. It was not working so well anymore anyway. I first installed ",[82,3168,3169],{},"Linux Mint",". It is an easy-to-use distro for the rest of the family. It's also definitely easier to maintain and won't suck up as much time as Arch. Maybe I'll switch again on this machine. I heard a great many things about ",[82,3172,3173],{},"Fedora"," and may give it a try. On the other hand, an Ubuntu-based distro like Mint can be a way to get familiar with this very big family of distributions. After all, it's one of the main systems running the world's servers.",{"title":186,"searchDepth":200,"depth":200,"links":3176},[3177,3182,3187,3192],{"id":2937,"depth":200,"text":2938,"children":3178},[3179,3180,3181],{"id":2941,"depth":228,"text":2942},{"id":2948,"depth":228,"text":2949},{"id":2955,"depth":228,"text":2956},{"id":2962,"depth":200,"text":2963,"children":3183},[3184,3185,3186],{"id":2966,"depth":228,"text":2967},{"id":2973,"depth":228,"text":2974},{"id":2980,"depth":228,"text":2981},{"id":2999,"depth":200,"text":3000,"children":3188},[3189,3190,3191],{"id":3003,"depth":228,"text":3004},{"id":3029,"depth":228,"text":3030},{"id":3078,"depth":228,"text":3079},{"id":3106,"depth":200,"text":3107,"children":3193},[3194,3195,3196],{"id":3110,"depth":228,"text":3111},{"id":3136,"depth":228,"text":3137},{"id":3162,"depth":228,"text":3163},"2024-05-29","I have used computers all my life, be it for work or pleasure, like many of my generation. I used the 3 major OSes on the market and I have to say that the best one for me is Linux, and here is why I switched away from the others.","\u002Fposts\u002Fmy_switch_to_linux\u002Ffeatured.svg",{},{"title":26,"description":3198},[3203,3204,3205],"Linux","OS","FOSS","HZsb2JzZ58cV1XktO94h_Y52GiV452ykBQUatN33DOI",1776107989676]