
Boot Environments TUI Manager

User avatar
JoseMR
Hardware & Software Guru
Posts: 1043
Joined: 16 Apr 2014 04:15
Location: PR
Contact:
Status: Offline

Boot Environments TUI Manager

#1

Post by JoseMR »

Hello, I have developed bemanager, a Boot Environments Manager utility that wraps around the well-known beadm utility for FreeBSD.

This little tool wraps the most common beadm tasks, plus compressed zfs send/receive features, as follows:
1) Activate Boot environments
2) Create Boot environments
3) Mount Boot environments
4) Unmount Boot environments
5) Rename Boot environments
6) Backup Boot environments
7) Restore Boot environments
8) Snapshot Boot environments
9) Destroy Boot environments
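The nine tasks above map onto beadm subcommands; here is a dry-run sketch (the BE name, mount point, and backup path are hypothetical, and bemanager's exact invocations may differ). DRYRUN=echo prints each command instead of executing it, so this is safe to run anywhere:

```shell
#!/bin/sh
# Dry-run sketch of the beadm commands each menu task wraps.
DRYRUN=echo
BE=testbe

$DRYRUN beadm activate "$BE"          # 1) activate
$DRYRUN beadm create "$BE"            # 2) create
$DRYRUN beadm mount "$BE" /tmp/be     # 3) mount
$DRYRUN beadm umount "$BE"            # 4) unmount
$DRYRUN beadm rename "$BE" newname    # 5) rename
$DRYRUN beadm create "$BE@mysnap"     # 8) snapshot
$DRYRUN beadm destroy -F "$BE"        # 9) destroy
# 6) backup / 7) restore pair a BE snapshot with compressed zfs send/receive:
$DRYRUN sh -c "zfs send zroot/ROOT/$BE@mysnap | xz > /backup/$BE.zfs"
```

Remove `DRYRUN=echo` (or set it empty) to execute the commands for real.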

An introduction to Boot Environments by Sławomir Wojciech Wojtczak (vermaden).

The project is developed on GitHub.

A manual is on the TODO list; however, the options and the sample config file are self-explanatory.

Note: bemanager works on any *BSD platform supporting the beadm utility.

Any feedback on how I can improve this little tool is very much welcome.

Regards
System: FreeBSD 12 RootOnZFS Mirror, MB: Supermicro X8SI6-F, Xeon X3450, 16GB DDR3 ECC RDIMMs.
XigmaNAS RootOnZFS
Addons at GitHub
BastilleBSD
Boot Environments Intro
Resources Home Page

User avatar
Maurizio
Starter
Posts: 62
Joined: 05 Jul 2018 21:49
Location: Linate (MIlan)
Status: Offline

Re: Boot Environments Manager

#2

Post by Maurizio »

Thank you JoseMR, I am just testing your utility :)
Some observations:
- In the version string put the bemanager version eg. "Boot Environments Manager 0.4.5b for beadm 1.2.8 2018/06/13"
- In backup/restore print the name of the BE and not the name of the snapshot, if possible.

Thanks again for your work.
XigmaNAS 12.1.0.4 on Dell R710 144GB RAM - RootOnZFS zroot on 2x 64GB 15k HDDs in mirror, zdata on 3x 1TB SSD in RAIDZ1.
2x XigmaNAS 11.2.0.4 - RootOnZFS on HPE Proliant Microserver gen10 X3216 - 3x 4TB WD RED. In mirror with zrep.

User avatar
Maurizio
Starter
Posts: 62
Joined: 05 Jul 2018 21:49
Location: Linate (MIlan)
Status: Offline

Re: Boot Environments Manager

#3

Post by Maurizio »

Running the utility on my FreeBSD server I get some strange output, where the last 6 elements are all "services":

Code: Select all

     ┌──────────────────────────Select Boot Environment─────────────────────────────┐
     │ Select any Boot Environment you which to backup, use [up] [down] keys to     │
     │ navigate the menu then select a item with the [spacebar].                    │
     │ ┌─────────────────────↑(-)─────────────────────────────────────────────────┐ │
     │ │                 ( ) services@zrep_00028a                                 │ │
     │ │                 ( ) services@zrep_00028b                                 │ │
     │ │                 ( ) services@zrep_00028c                                 │ │
     │ │                 ( ) services@zrep_00028d                                 │ │
     │ │                 ( ) services@zrep_00028e                                 │ │
     │ │                 ( ) services                                             │ │
     │ │                 ( ) services                                             │ │
     │ │                 ( ) services                                             │ │
     │ │                 ( ) services                                             │ │
     │ │                 ( ) services                                             │ │
     │ └──────────────────────────────────────────────────────────────────100%────┘ │
     │                                                                              │
     │                                                                              │
     ├──────────────────────────────────────────────────────────────────────────────┤
     │                       <  OK  >            <Cancel>                           │
     └──────────────────────────────────────────────────────────────────────────────┘

The output of zfs list -rt snapshot pool_ssd is:

Code: Select all

NAME                                               USED  AVAIL  REFER  MOUNTPOINT
pool_ssd/ROOT/release11@2018-06-25-17:16:04        178M      -  29.0G  -
pool_ssd/ROOT/release11@2018-06-28-12:40:13       48.1M      -  29.0G  -
pool_ssd/ROOT/release11@2018-07-01-10:33:02       5.25M      -  29.6G  -
pool_ssd/ROOT/release11@2018-07-03_23.15.00--14d  4.86M      -  29.6G  -
pool_ssd/ROOT/release11@2018-07-04_23.15.00--14d  4.22M      -  29.7G  -
pool_ssd/ROOT/release11@2018-07-05_23.15.00--14d  3.28M      -  29.7G  -
pool_ssd/ROOT/release11@2018-07-06_23.15.00--14d  3.59M      -  29.7G  -
pool_ssd/ROOT/release11@2018-07-07_23.15.00--14d  3.28M      -  29.7G  -
pool_ssd/ROOT/release11@2018-07-08_23.15.00--14d  3.03M      -  29.7G  -
pool_ssd/ROOT/release11@2018-07-09_23.15.00--14d  37.1M      -  29.7G  -
pool_ssd/ROOT/release11@2018-07-10_23.15.00--14d  4.79M      -  29.7G  -
pool_ssd/ROOT/release11@2018-07-11_23.15.00--14d  1.59M      -  29.7G  -
pool_ssd/ROOT/release11@zrep_00016a               1.25M      -  29.7G  -
pool_ssd/ROOT/release11@2018-07-12_23.15.00--14d  1.36M      -  29.7G  -
pool_ssd/ROOT/release11@zrep_00016b               1.36M      -  29.7G  -
pool_ssd/ROOT/release11@2018-07-13_23.15.00--14d  1.27M      -  29.7G  -
pool_ssd/ROOT/release11@zrep_00016c               1.23M      -  29.7G  -
pool_ssd/ROOT/release11@2018-07-14_23.15.00--14d  2.70M      -  29.7G  -
pool_ssd/ROOT/release11@2018-07-15_23.15.00--14d  3.43M      -  29.7G  -
pool_ssd/ROOT/release11@2018-07-16_23.15.00--14d  1.59M      -  29.7G  -
pool_ssd/ROOT/release11@zrep_00016d               1.38M      -  29.7G  -
pool_ssd/ROOT/release11@2018-07-17_23.15.00--14d  1.31M      -  29.7G  -
pool_ssd/ROOT/release11@zrep_00016e               1.29M      -  29.7G  -
pool_ssd/root@zrep_000041                         12.5M      -   564M  -
pool_ssd/root@zrep_000042                         12.2M      -   564M  -
pool_ssd/root@zrep_000043                         12.3M      -   564M  -
pool_ssd/root@zrep_000044                         12.2M      -   554M  -
pool_ssd/root@zrep_000045                         11.8M      -   554M  -
pool_ssd/zfsguru@zrep_00028b                          0      -   160K  -
pool_ssd/zfsguru@zrep_00028c                          0      -   160K  -
pool_ssd/zfsguru@zrep_00028d                          0      -   160K  -
pool_ssd/zfsguru@zrep_00028e                          0      -   160K  -
pool_ssd/zfsguru@zrep_00028f                          0      -   160K  -
pool_ssd/zfsguru/download@zrep_00028b                 0      -   144K  -
pool_ssd/zfsguru/download@zrep_00028c                 0      -   144K  -
pool_ssd/zfsguru/download@zrep_00028d                 0      -   144K  -
pool_ssd/zfsguru/download@zrep_00028e                 0      -   144K  -
pool_ssd/zfsguru/download@zrep_00028f                 0      -   144K  -
pool_ssd/zfsguru/services@zrep_00028b                 0      -   152K  -
pool_ssd/zfsguru/services@zrep_00028c                 0      -   152K  -
pool_ssd/zfsguru/services@zrep_00028d                 0      -   152K  -
pool_ssd/zfsguru/services@zrep_00028e                 0      -   152K  -
pool_ssd/zfsguru/services@zrep_00028f                 0      -   152K  -
pool_ssd/zfsguru/services/9.1-005@zrep_00028b         0      -  1.77M  -
pool_ssd/zfsguru/services/9.1-005@zrep_00028c         0      -  1.77M  -
pool_ssd/zfsguru/services/9.1-005@zrep_00028d         0      -  1.77M  -
pool_ssd/zfsguru/services/9.1-005@zrep_00028e         0      -  1.77M  -
pool_ssd/zfsguru/services/9.1-005@zrep_00028f         0      -  1.77M  -

User avatar
JoseMR
Hardware & Software Guru
Posts: 1043
Joined: 16 Apr 2014 04:15
Location: PR
Contact:
Status: Offline

Re: Boot Environments Manager

#4

Post by JoseMR »

Thanks, Maurizio, for your feedback, it is much appreciated. Regarding the snapshot@snapname format: you can have several BEs with the same name in the list, and what you look for is the latest snapshot of a given BE, so the date/snapshot name gives you a hint about the changes. This was done on purpose, though only for the zfs send/recv features; removing the snapshot string from the BE name would leave the user in the dark. See the image below for an example; IMO it would be even more confusing if no BE name carried its @snapname string:
BEM-Snap-Name.PNG

As for the services@zrep entries, this is really odd, as the tool uses "beadm list" and "zfs list" to gather the required information. If you can provide the output of "zfs list -t snapshot" and "beadm list -a", I can get an idea of what is causing those unrelated services datasets to appear in the BE list.

Regards

User avatar
JoseMR
Hardware & Software Guru
Posts: 1043
Joined: 16 Apr 2014 04:15
Location: PR
Contact:
Status: Offline

Re: Boot Environments Manager

#5

Post by JoseMR »

Hi, you can upgrade bemanager to version 0.4.6. This version prints the BEM version at the top, and it will also try to match only snapshots whose dataset exactly matches the BE root (for example "zroot/ROOT") and ignore unrelated ones.
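The exact-match filtering described here presumably amounts to something like the following (assumed logic, not bemanager's actual code); the sample names are taken from the listing earlier in the thread:

```shell
#!/bin/sh
# Keep only snapshots whose dataset sits under the BE root; unrelated
# trees such as pool_ssd/zfsguru/* are dropped.
BE_ROOT="pool_ssd/ROOT"
filtered=$(printf '%s\n' \
    "pool_ssd/ROOT/release11@zrep_00016a" \
    "pool_ssd/zfsguru/services@zrep_00028b" \
    "pool_ssd/zfsguru/services/9.1-005@zrep_00028b" \
  | awk -F'@' -v root="$BE_ROOT" 'NF == 2 && index($1, root "/") == 1')
echo "$filtered"
```

On a live system the snapshot names would come from `zfs list -H -t snapshot -o name` instead of the inline sample.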

Please let me know if it works for you.

Regards

EDIT: This should be solved in bemanager version 0.4.7, I hope.

User avatar
Maurizio
Starter
Posts: 62
Joined: 05 Jul 2018 21:49
Location: Linate (MIlan)
Status: Offline

Re: Boot Environments Manager

#6

Post by Maurizio »

Hi JoseMR,
even with the latest version I get the same output. I am submitting the output of the two commands "beadm list -a" and "zfs list -t snapshot | grep pool_ssd". pool_ssd is the OS pool.
Regards.

Code: Select all

 Boot Environments Manager 0.4.7b for beadm 1.2.3 2015/09/08                                        
 ────────────────────────────────────────────────────────────────────────────────────────────────── 
                                                                                                    
                                                                                                    
                                                                                                    
                                                                                                    
                                                                                                    
                                                                                                    
                                                                                                    
                                                                                                    
         ┌──────────────────────────Select Boot Environment─────────────────────────────┐           
         │ Select any Boot Environment you which to backup, use [up] [down] keys to     │           
         │ navigate the menu then select a item with the [spacebar].                    │           
         │ ┌─────────────────────↑(-)─────────────────────────────────────────────────┐ │           
         │ │                 ( ) services@zrep_00028c                                 │ │           
         │ │                 ( ) services@zrep_00028d                                 │ │           
         │ │                 ( ) services@zrep_00028e                                 │ │           
         │ │                 ( ) services@zrep_00028f                                 │ │           
         │ │                 ( ) services@zrep_000290                                 │ │           
         │ │                 ( ) services                                             │ │           
         │ │                 ( ) services                                             │ │           
         │ │                 ( ) services                                             │ │           
         │ │                 ( ) services                                             │ │           
         │ │                 ( ) services                                             │ │           
         │ └──────────────────────────────────────────────────────────────────100%────┘ │           
         │                                                                              │           
         │                                                                              │           
         ├──────────────────────────────────────────────────────────────────────────────┤           
         │                       <  OK  >            <Cancel>                           │           
         └──────────────────────────────────────────────────────────────────────────────┘           
                                                                                                    
                                                                                                    
                                                                                                    
                                                                                                    
                                                                                                    
                                                                                                    
                                                                                                    
                                                                                                    
                                                                                                    
                                                                                                    
                                                                                                    
[0] 0:top  1:mc  2:mc  3:bash- 4:cdialog*                               clover-nas2  14:03 19-Jul-18

Code: Select all

# beadm list -a
BE/Dataset/Snapshot                                Active Mountpoint  Space Created

release11
  pool_ssd/ROOT/release11                          NR     /           36.0G 2017-07-14 11:11

upgrade
  pool_ssd/ROOT/upgrade                            -      -            8.0K 2018-06-28 12:40
    release11@2018-06-28-12:40:13                  -      -          941.0M 2018-06-28 12:40

pkg.old
  pool_ssd/ROOT/pkg.old                            -      -            8.0K 2018-07-01 10:33
    release11@2018-07-01-10:33:02                  -      -           94.6M 2018-07-01 10:33

pkg
  pool_ssd/ROOT/pkg                                -      -            8.0K 2018-07-18 10:24
    release11@2018-07-18-10:24:01                  -      -           16.0M 2018-07-18 10:24

pre-nextcloud
  pool_ssd/ROOT/pre-nextcloud                      -      -            8.0K 2018-07-18 17:04
    release11@2018-07-18-17:04:58                  -      -           13.6M 2018-07-18 17:04

Code: Select all

# zfs list -t snapshot | grep pool_ssd
NAME                                               USED  AVAIL  REFER  MOUNTPOINT
pool_ssd/ROOT/release11@2018-06-25-17:16:04        178M      -  29.0G  -
pool_ssd/ROOT/release11@2018-06-28-12:40:13       48.1M      -  29.0G  -
pool_ssd/ROOT/release11@2018-07-01-10:33:02       5.25M      -  29.6G  -
pool_ssd/ROOT/release11@2018-07-03_23.15.00--14d  4.86M      -  29.6G  -
pool_ssd/ROOT/release11@2018-07-04_23.15.00--14d  4.22M      -  29.7G  -
pool_ssd/ROOT/release11@2018-07-05_23.15.00--14d  3.28M      -  29.7G  -
pool_ssd/ROOT/release11@2018-07-06_23.15.00--14d  3.59M      -  29.7G  -
pool_ssd/ROOT/release11@2018-07-07_23.15.00--14d  3.28M      -  29.7G  -
pool_ssd/ROOT/release11@2018-07-08_23.15.00--14d  3.03M      -  29.7G  -
pool_ssd/ROOT/release11@2018-07-09_23.15.00--14d  37.1M      -  29.7G  -
pool_ssd/ROOT/release11@2018-07-10_23.15.00--14d  4.79M      -  29.7G  -
pool_ssd/ROOT/release11@2018-07-11_23.15.00--14d  1.59M      -  29.7G  -
pool_ssd/ROOT/release11@zrep_00016a               1.25M      -  29.7G  -
pool_ssd/ROOT/release11@2018-07-12_23.15.00--14d  1.36M      -  29.7G  -
pool_ssd/ROOT/release11@zrep_00016b               1.36M      -  29.7G  -
pool_ssd/ROOT/release11@2018-07-13_23.15.00--14d  1.27M      -  29.7G  -
pool_ssd/ROOT/release11@zrep_00016c               1.23M      -  29.7G  -
pool_ssd/ROOT/release11@2018-07-14_23.15.00--14d  2.70M      -  29.7G  -
pool_ssd/ROOT/release11@2018-07-15_23.15.00--14d  3.43M      -  29.7G  -
pool_ssd/ROOT/release11@2018-07-16_23.15.00--14d  1.59M      -  29.7G  -
pool_ssd/ROOT/release11@zrep_00016d               1.38M      -  29.7G  -
pool_ssd/ROOT/release11@2018-07-17_23.15.00--14d  1.31M      -  29.7G  -
pool_ssd/ROOT/release11@zrep_00016e               1.29M      -  29.7G  -
pool_ssd/root@zrep_000041                         12.5M      -   564M  -
pool_ssd/root@zrep_000042                         12.2M      -   564M  -
pool_ssd/root@zrep_000043                         12.3M      -   564M  -
pool_ssd/root@zrep_000044                         12.2M      -   554M  -
pool_ssd/root@zrep_000045                         11.8M      -   554M  -
pool_ssd/zfsguru@zrep_00028b                          0      -   160K  -
pool_ssd/zfsguru@zrep_00028c                          0      -   160K  -
pool_ssd/zfsguru@zrep_00028d                          0      -   160K  -
pool_ssd/zfsguru@zrep_00028e                          0      -   160K  -
pool_ssd/zfsguru@zrep_00028f                          0      -   160K  -
pool_ssd/zfsguru/download@zrep_00028b                 0      -   144K  -
pool_ssd/zfsguru/download@zrep_00028c                 0      -   144K  -
pool_ssd/zfsguru/download@zrep_00028d                 0      -   144K  -
pool_ssd/zfsguru/download@zrep_00028e                 0      -   144K  -
pool_ssd/zfsguru/download@zrep_00028f                 0      -   144K  -
pool_ssd/zfsguru/services@zrep_00028b                 0      -   152K  -
pool_ssd/zfsguru/services@zrep_00028c                 0      -   152K  -
pool_ssd/zfsguru/services@zrep_00028d                 0      -   152K  -
pool_ssd/zfsguru/services@zrep_00028e                 0      -   152K  -
pool_ssd/zfsguru/services@zrep_00028f                 0      -   152K  -
pool_ssd/zfsguru/services/9.1-005@zrep_00028b         0      -  1.77M  -
pool_ssd/zfsguru/services/9.1-005@zrep_00028c         0      -  1.77M  -
pool_ssd/zfsguru/services/9.1-005@zrep_00028d         0      -  1.77M  -
pool_ssd/zfsguru/services/9.1-005@zrep_00028e         0      -  1.77M  -
pool_ssd/zfsguru/services/9.1-005@zrep_00028f         0      -  1.77M  -

User avatar
JoseMR
Hardware & Software Guru
Posts: 1043
Joined: 16 Apr 2014 04:15
Location: PR
Contact:
Status: Offline

Re: Boot Environments Manager

#7

Post by JoseMR »

Thanks, Maurizio, for the support. The issue is indeed in the last 5 lines of your last post: there is a "/" at field 3:

Code: Select all

pool_ssd/zfsguru/services/9.1-005@zrep_00028b         0      -  1.77M  -
pool_ssd/zfsguru/services/9.1-005@zrep_00028c         0      -  1.77M  -
pool_ssd/zfsguru/services/9.1-005@zrep_00028d         0      -  1.77M  -
pool_ssd/zfsguru/services/9.1-005@zrep_00028e         0      -  1.77M  -
pool_ssd/zfsguru/services/9.1-005@zrep_00028f         0      -  1.77M  -

Those snapshots are part of pool_ssd (the ZFS root), but the "@" is not at field 3 as expected, since I was developing the tool around the FreeBSD installer defaults. Indeed this can cause problems for custom installs, so I will find a way to tell the utility to look explicitly for the "@".
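The pitfall can be illustrated like this (the field-based assumption is inferred from this post, not lifted from bemanager's source): with the installer's default layout the "@" lands in the third "/"-separated field, but deeper custom datasets push it further right, so splitting on the "@" itself is depth-proof:

```shell
#!/bin/sh
# Demonstrate why a fixed "/"-field assumption breaks on nested datasets.
default="zroot/ROOT/default@snap1"
custom="pool_ssd/zfsguru/services/9.1-005@zrep_00028b"
f3_default=$(echo "$default" | awk -F'/' '{print $3}')  # "default@snap1"
f3_custom=$(echo "$custom" | awk -F'/' '{print $3}')    # "services" -- no "@" here
snap=${custom##*@}   # split on "@" instead: works at any dataset depth
ds=${custom%@*}
echo "$f3_default / $f3_custom / $ds@$snap"
```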

I will create similar environments in my setup in order to test, then release a new version.

Regards

P.S. Since the tool is all about easing Boot Environment related tasks (not a general zfs send/recv tool), maybe I can just gather the BE-related snapshots from

Code: Select all

beadm list -s
for best results and less clutter. I will make some tests right away.

Again thanks for the kind support.

User avatar
JoseMR
Hardware & Software Guru
Posts: 1043
Joined: 16 Apr 2014 04:15
Location: PR
Contact:
Status: Offline

Re: Boot Environments Manager

#8

Post by JoseMR »

Hi, as of bemanager version 0.4.8, only snapshots of the active ZFS bootfs will be displayed, as it should have been in the first place. :)
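Filtering by the active bootfs presumably boils down to something like this sketch (assumed logic; the property value is hard-coded here so the example runs anywhere):

```shell
#!/bin/sh
# Derive the BE root from the pool's bootfs property, then list only
# snapshots under it. On a live system the first line would be:
#   bootfs=$(zpool get -H -o value bootfs pool_ssd)
bootfs="pool_ssd/ROOT/release11"
be_root=${bootfs%/*}   # -> pool_ssd/ROOT
echo "$be_root"
# zfs list -H -t snapshot -o name | grep "^$be_root/" would then yield
# only Boot Environment snapshots.
```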

Feedback is always welcome.

Regards

User avatar
Maurizio
Starter
Posts: 62
Joined: 05 Jul 2018 21:49
Location: Linate (MIlan)
Status: Offline

Re: Boot Environments Manager

#9

Post by Maurizio »

Hi JoseMR,
now it works!
Regards

ThomasC
NewUser
Posts: 3
Joined: 12 Feb 2015 02:55
Status: Offline

Re: Boot Environments TUI Manager

#10

Post by ThomasC »

I could be using bemanager incorrectly, but I may have found a bug/quirk. Please note that I can reproduce the problem even when using beadm commands directly.

PROBLEM: when renaming the currently booted Boot Environment and then leaving it as-is active (or activating it again), then on the next reboot it looks for the original BE name instead of the renamed BE, and the boot fails (at which point I reboot and use the early "Welcome to XigmaNAS" boot prompt to select another good BE).

For this problem scenario even if I stop using bemanager and run "beadm rename old_name new_name" followed by "beadm activate new_name" the last command returns "Already activated" (I'm guessing the root of the problem, since it doesn't activate the new name). If instead, I rename then activate a BE that is not currently the booted BE then all works great on the next reboot.

My environment:
bemanager 0.6.4b
beadm 1.2.9
XigmaNAS 11.2.0.4.6400

I have no idea whether this problem occurs in the XigmaNAS 12.x beta branch. Perhaps bemanager could check whether the BE being renamed is the currently booted BE, produce a warning/explanation, and prevent that rename scenario from happening?
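A guard of the kind suggested here could look roughly like this (hypothetical; not present in bemanager). It parses the Active column of `beadm list`, where "N" marks the BE booted now; sample output is inlined so the sketch runs anywhere:

```shell
#!/bin/sh
# Refuse to rename the currently booted Boot Environment.
target="release11"
active_now=$(printf '%s\n' \
    "release11  NR  /  36.0G  2017-07-14 11:11" \
    "upgrade    -   -  8.0K   2018-06-28 12:40" \
  | awk '$2 ~ /N/ {print $1}')
if [ "$active_now" = "$target" ]; then
  echo "refusing to rename the currently booted BE: $target"
fi
```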

PS - I have been using the Embedded platform for roughly 4 years and at times tested the legacy Full RootOnUFS platform, but I am loving the whole Full RootOnZFS platform/mindset/capabilities as my main platform now, thank you!!!
XigmaNAS 11.2.0.4.6400 Full RootOnZFS , Supermicro A1SRM-2558F , 16GB ECC , pool: 4 x 3TB WD Red NAS 3.5" , root: 1 x 1TB WD Red NAS 2.5"

User avatar
JoseMR
Hardware & Software Guru
Posts: 1043
Joined: 16 Apr 2014 04:15
Location: PR
Contact:
Status: Offline

Re: Boot Environments TUI Manager

#11

Post by JoseMR »

ThomasC wrote:
14 Feb 2019 20:22
I could be using bemanager incorrectly but I've maybe found a bug/quirk. Please note though that I can reproduce the problem even when using beadm commands as well.

[…]

Hi ThomasC, welcome to the forum. Thanks for trying out RootOnZFS and bemanager, and nice catch spotting and reporting the bug here.

Since I don't usually rename the current BE, I hadn't experienced this myself; however, I definitely reproduced it with the beadm command like you did, and found that this bug is related to the RootOnZFS platform (now fixed).

The problem is that when renaming the current BE, the beadm utility does update the "bootfs" property as it should, but "vfs.root.mountfrom" in /boot/loader.conf stays pointed at the previous BE. Since the bootfs property now takes care of the boot process, I will remove the offending legacy loader.conf line.

Permanent solution: open "/boot/loader.conf" and comment out or simply remove the "vfs.root.mountfrom" line, since it is no longer needed. Note that this is a one-time fix; new BEs will follow the change.
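The fix can also be scripted; this sketch works on a temporary copy so it is safe to try anywhere (on the real system the file is /boot/loader.conf, and you should back it up and run as root):

```shell
#!/bin/sh
# Comment out the vfs.root.mountfrom line; with RootOnZFS the pool's
# bootfs property now selects the root dataset.
conf=$(mktemp)
printf '%s\n' 'vfs.root.mountfrom="zfs:pool_ssd/ROOT/release11"' 'zfs_load="YES"' > "$conf"
sed 's/^vfs\.root\.mountfrom/#&/' "$conf" > "$conf.new" && mv "$conf.new" "$conf"
fixed=$(grep '^#vfs\.root\.mountfrom' "$conf")
echo "$fixed"
rm -f "$conf"
```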


P.S. I will update the RootOnZFS platform installer today and send it to the developers for the next release. Again, thank you for the bug report.
Edit: This has been fixed as of Revision 6473.

Regards

ThomasC
NewUser
Posts: 3
Joined: 12 Feb 2015 02:55
Status: Offline

Re: Boot Environments TUI Manager

#12

Post by ThomasC »

Hi JoseMR, thank you for all your work to fix this and for the informative explanation. :mrgreen: I applied your permanent fix, and my testing has been working reliably since.

User avatar
JoseMR
Hardware & Software Guru
Posts: 1043
Joined: 16 Apr 2014 04:15
Location: PR
Contact:
Status: Offline

Re: Boot Environments TUI Manager

#13

Post by JoseMR »

This utility has been updated to bemanager v0.7.1, which now includes an option that, when enabled, makes the backup task even simpler to use.

Existing users can simply add BE_SIMPLE_MODE="yes" to the config file after a manual upgrade; otherwise the utility will generate the file on a new installation.

What this option does is simplify backup creation: it automatically takes a fresh snapshot of the selected Boot Environment (in advanced mode the snapshot is taken manually) and sends it to the backup location defined in the config file or specified by the user. It also shows only the current list of Boot Environments, instead of the full BE@snapshot list, which can be very large and tedious to navigate when selecting the BE@snapshot to back up.

Additionally, with this option enabled, the utility always captures the latest BE changes before backup.
Usage is as simple as starting bemanager in TUI mode, selecting a BE, and clicking OK to save.
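Simple mode presumably automates roughly these manual steps (the dataset and backup path are examples from this thread, not the tool's actual defaults; DRYRUN=echo keeps this a dry run that only prints the commands):

```shell
#!/bin/sh
# Snapshot the selected BE, then send it compressed to the backup location.
DRYRUN=echo
BE=default
STAMP=$(date +%Y-%m-%d-%H:%M:%S)
$DRYRUN zfs snapshot "zroot/ROOT/$BE@$STAMP"
$DRYRUN sh -c "zfs send zroot/ROOT/$BE@$STAMP | xz > /mnt/storage/zfsbackups/$BE-$STAMP.zfs"
```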

Sample TUI screenshot:
bemanager-TUI.png


CLI/scripting mode is just as simple; execute:

Code: Select all

bemanager -b bename
or, with an optionally specified location:

Code: Select all

bemanager -b bename location
where "location" is either local or remote.

Sample CLI backup output:

Code: Select all

root@nas-mserver: ~# bemanager -b default
==> Trying to snapshot default Boot Environment...
Created successfully
Boot Environment default snapshot successfully!
==> Trying to backup default Boot Environment...
  100 %      303.7 MiB / 1244.0 MiB = 0.244    69 MiB/s       0:17             
Boot Environment default saved to /mnt/storage/zfsbackups/nas/default-2019-05-11-00:10:38.zfs successfully!
root@nas-mserver: ~#
Restoring a Boot Environment backup follows the same steps in TUI mode: just select the file from the list and click OK to restore.

And to restore from CLI mode, simply execute:

Code: Select all

bemanager -r filename.zfs
or, with an optionally specified location:

Code: Select all

bemanager -r filename.zfs location
where "location" is either local or remote.

Sample CLI restore output:

Code: Select all

root@nas-mserver: ~# bemanager -r default-2019-05-11-00:10:38.zfs
==> Trying to restore default-2019-05-11-00:10:38.zfs from file...
/mnt/storage/zfsbackups/nas/default-2019-05-11-00:10:38.zfs (1/1)
  100 %      303.7 MiB / 1244.0 MiB = 0.244    39 MiB/s       0:32             
Boot Environment default-2019-05-11-00:10:38.zfs restored to zroot/ROOT/restore-2019-05-11-001720 successfully!
Please activate the restored Boot Environment now so it will take effect after reboot!
root@nas-mserver: ~#
Remember, don't forget to make new BEs when testing packages. If you prefer the easy way, there is also a BE Manager Extension.

Regards
