Hello all,

I am currently moving to Git. It was about time, I know.

The way I want to work is the following:
1. I have a NAS where I installed Git.
1.1. I can create folders on the NAS and, via SSH, create repositories in there.
2. I have installed Git for Windows on my laptop.
3. I have also installed TortoiseGit on my laptop.

My plan is to create a repository for each part of the project (one for the PLC code, another one for the robot code...). That way, if I have to work with someone who is specialized in only one area, I don't have to share everything with them.

Now I can clone those remote repositories into local folders, stage files, commit...
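
For reference, the basic round trip looks like this (the host name "nas", the user and the /volume1/git path are just examples for a typical NAS; adjust to yours):

    ssh user@nas "git init --bare /volume1/git/plc-code.git"   # create the remote repo once
    git clone ssh://user@nas/volume1/git/plc-code.git          # clone it to the laptop
    cd plc-code
    git add .
    git commit -m "Initial import of the PLC project"
    git push origin HEAD    # first push creates the branch on the NAS (master or main, depending on your Git version)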

And now the backups...
I plan to create a backup of all my local repositories into the NAS.
This will run automatically every night at a set time, or I will trigger it manually.

And the NAS repositories will also be backed up every night.
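
As a sketch of what the nightly job could run for each local repository (Git Bash syntax; the paths are examples), "git bundle" packs the complete history into a single file that is easy to copy and restore:

    cd /c/work/plc-code
    git bundle create //nas/backup/plc-code.bundle --all   # whole history in one file
    git bundle verify //nas/backup/plc-code.bundle         # check the backup is usable
    # restoring later is just a clone:
    # git clone plc-code.bundle plc-code-restored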

And now, the questions:
Am I missing something?
Do the Git plan and the backup plan look OK?
Any recommendations?

After the comment from @0x01AA, I have another question.
I work on big projects in which multiple machines and programmers are involved.
I have thought of creating one repository per machine and per controller to be programmed.
But then... how should I handle the documents that are global to the project?
Should I create another repository to hold those documents?

Thank you all!

What I have tried:

Installed the server on the NAS, Git for Windows and TortoiseGit on the laptop, and tried to clone a repo. I even started to commit things there.

I am preparing the backup strategy now, but I wanted to ask for advice before going on.
Comments
0x01AA 18-Aug-24 6:52am    
If something is 'important for the project', why not add it to the repository? Size?
Joan M 18-Aug-24 7:12am    
hmmm... yes, that way I would be able to clone it all easily.
Thanks.
PIEBALDconsult 18-Aug-24 14:25pm    
From what and why?
On my last gig we used TFS and it was great. Then we were forced to switch to Git, we could no longer develop utilities to support the flow, and it was horrible.

Never switch anything. Whatever you start with, stick with it.
Joan M 18-Aug-24 14:50pm    
I am moving from the automatic backups of my NAS, which take a snapshot of every change to every file in the projects folder for me. Those had to be deactivated while working, due to file locks/permissions. That worked fine for me, but only for me; it was not possible to work as a real team with my customers.
This method is still much more advanced than what I usually see: folders named _last, _this_is_really_the_last, _NOW_THIS_IS_THE_LAST, _20240521_1, 20240521_1_1335... you know what I mean.

I have been talking with customers I often work with, who work in small teams of up to 10 people, about starting to use version control to get better control over what they are doing. They will probably apply this soon.
I should be able to follow.

Luckily for me, changing is not a big deal: I've been developing machines using the same folder structures for years now... the only issue I can see is the documents shared across the multiple machines of the same line, but nothing is perfect.

All this said, given that at the end of each project we move to the final customer's site to commission everything, and sometimes there is no internet connection there, having something like Git, which gives you one local and one remote repository, is an exceptionally good thing.
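
For example, the on-site routine is just the normal local workflow (the file and branch names here are illustrative):

    git add recipe_handler.st
    git commit -m "Tune dosing recipe on site"   # works with no network at all
    git log origin/master..master --oneline      # what is still unpushed to the NAS
    git push origin master                       # run this once back online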

In the industrial automation world (where everything is at least 10 years behind), using Git will be a godsend.

In any case, I've looked at the TFS page, and being able to enforce workflows seems nice and handy. But... maybe in 10 years... :D

Thanks for your comment!

Git is not your issue; it's the workflow you have to define. Git could be SVN or any other version control system. Your backup process is fine. Now, I use SVN, and it lives in a VM, so copy-paste the VM and backup is solved.

You're already heading in this direction, but I would state it up front: you have an interesting situation. You have multiple contributors and you don't want them to mingle. So, use separate repositories with appropriate permissions and write scripts to build things. My last project became an SVN disaster: there are five repositories that feed the product build. So I wrote scripts to work with all of the repos, and life is good.
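
Something along these lines (the repo names are made up):

    # refresh every repository that feeds the build
    for repo in plc-code robot-code hmi-code docs; do
        git -C "$repo" pull --ff-only
    done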

Documentation: this has been a conundrum for me. If I have a document that explains how to build the product, does it go into the product's source control area (we have a documentation folder), or into a separate area that is product agnostic? My team never resolved this, so now we have documentation in two different places. Avoid this and focus on consistency. The high-level getting-started material is project specific but not too detailed, so it goes in the general doc repo. Specific product items go in the product repo. It goes against the grain of 'put all docs in the doc folder', but if you think about it, why would a developer working on product A care about the docs for product B?
 
Comments
0x01AA 18-Aug-24 9:20am    
"Avoid this and focus on consistency" -> +5
Joan M 18-Aug-24 9:42am    
In my case there are global documents with, for example, the IP addresses of all the machines involved in that line. They could be split into small documents, but that makes it harder to keep track of limits and free addresses...

There are also the documents we use to configure a specific device (scanner, camera, temperature sensor...). Those documents affect the projects where the devices are, but they can affect more than one project/machine in that line.
Keeping those documents separated makes consistency much harder, as we can end up with different versions everywhere.

Maybe all the repositories could have access to another repository that would handle documentation alone. Would that be possible?

Otherwise, I can leave those documents out of the Git setup... and just keep them backed up.
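(From what I've read, Git submodules seem designed for exactly that. A rough sketch, if I understood the docs correctly; the paths are examples:)

    git submodule add ssh://user@nas/volume1/git/line-docs.git docs   # once per machine repo
    git commit -m "Add shared line documentation as a submodule"
    git clone --recurse-submodules ssh://user@nas/volume1/git/plc-code.git   # clones the docs too
    git submodule update --remote docs   # pull in the newest docs later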
I have read this question a few times and I keep coming back to the same basic question: why? I understand that you want to work with Git, and that part's fine, but the bit I am struggling to get my head around is why you think you need to host this locally. What do you think you are going to get that you can't get from a standard Git setup? It's entirely possible for you to create private repositories that you invite people into to collaborate as needed, and then you don't have the headache of devising your own backup capabilities.

What is the purpose of needing to host your own version? What is it that you think you aren't getting from standard GitHub? I ask this because maintaining your own Git server, including security patching it, is an absolute PITA (I speak from painful experience here).
 
Comments
Joan M 19-Aug-24 6:36am    
First of all, thank you for posting!

I am an industrial programmer; this means that from time to time I travel and spend from a week to a month (or even more) out of the office, in companies with rigid IT policies, sometimes without an internet connection, and I want to keep track of the changes made.
That's why I want the local repository, which I later synchronize with the remote one.
That's the main advantage of Git compared to other solutions like SVN.

There are also companies that simply don't allow putting their code in the cloud/internet.

Regarding patching and security... wouldn't updating the Git package my NAS provider offers, plus the local Windows Git installation, patch it all?

I have been looking at Gitea and GitLab (both offer a super nice GUI for Git), but both must be installed using Docker and linked to a database, making the important folders inside the Docker container public so they can be backed up... so I preferred to stay with what is native on my NAS.
If you want more control over your Git server, you might be interested in Gitea:
GitHub - go-gitea/gitea: Git with a cup of tea! Painless self-hosted all-in-one software development service, including Git hosting, code review, team collaboration, package registry and CI/CD
We have been running it on a local network on Windows 10 with a PostgreSQL database for years now and are very pleased with it.
It has a GitHub-like interface which even our new developers can use quite easily.

[Edit] I now see in your comments that you think Docker is necessary for Gitea, but it is optional; we just run the single executable, that's all ...
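
On a Linux-based NAS, the idea would be something like this (the version number and download URL are examples; check the Gitea releases page for the current ones):

    wget -O gitea https://dl.gitea.com/gitea/1.22.1/gitea-1.22.1-linux-amd64
    chmod +x gitea
    ./gitea web   # serves the web UI on http://localhost:3000 by default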
 
Comments
Joan M 20-Aug-24 13:29pm    
Yes, it looks very interesting... I will have to check it out; having a nice interface like GitHub and at the same time having the local server seems a great combo.
In my case I would need Docker, because I would install it on our NAS.

Thank you for posting!
Joan M 21-Aug-24 5:44am    
I have one question, Rick: does Gitea run as a standalone Git server, or does it require a Git server to operate?
Thank you in advance!
RickZeeland 21-Aug-24 6:17am    
Git is required, see: System Requirements
https://docs.gitea.com/
Joan M 21-Aug-24 6:36am    
Great! This means I don't have to do strange things syncing two different servers to make it work.
In the end this becomes a good-looking GUI with some extra sugar to make things easier.
Will definitely try it.
Thanks!
