Will it Openstack

Beholder101

Dabbler
Joined
Feb 21, 2016
Messages
14
My bad! Thank you very much for looking further than I did. Here I was thinking that 11.2 was STABLE, but STABLE can also mean BETA or RC. I found that out pretty early on, before posting here. I never noticed the changed API.

I did some searching and found this. This is taken from the introduction text for 11.2 STABLE (read: BETA):

The rewrite from the old API to the new middlewared continues. Once the API stabilizes and the rewrite is complete, api.freenas.org will be deprecated and replaced by the new API documentation. In the mean time, to see the API documentation for the new middleware, log into the new UI, click on the URL for the FreeNAS system in your browser’s location bar, and add /api/docs to the end of that URL.

I'll take a step back to 11.1. This is not a simple switch, as 'minor release' downgrades are not supported. It does have me a little worried about whether Cinder driver development is going to continue.

Thank you again for your support!
 

Chris Moore

Hall of Famer
Joined
May 2, 2015
Messages
10,080
11.1 to 11.2 is a much larger change than the numbers would indicate.
The numbers relate back to the underpinning BSD version, but the complete overhaul of the UI and middleware makes it worth a full version number (my opinion) for FreeNAS.

Sent from my SAMSUNG-SGH-I537 using Tapatalk
 

Beholder101

Dabbler
Joined
Feb 21, 2016
Messages
14
I downgraded to 11.1 by doing a completely new install and importing the zpool (OpenZFS version 5000). There are a huge number of error messages all over the GUI: Advanced settings says "Something went wrong, please try again later.", the Dashboard says "Error 201:[ENOMETHOD] Method "summary" not found in "network.general"", and pool settings says "Error 201:[ENOMETHOD] Method "query" not found in "pool.dataset"". The SMB service says 'Crashed', so I will need to see what happened there too.

I figured I would see if there are updates, or even try the nightlies, to overcome them, but updates don't work and get stuck at updating (the whoosh keeps whooshing).

The last option would be a complete reinstall of everything, but OpenStack is too interwoven with the FreeNAS-served NFS shares to just break that. That is a whole other mess to clean up.

Any suggestions on how to get the updates running, and is trying the nightlies a good approach?

And some more investigation... does FreeNAS store config data on the zpools? Even after a complete reinstall I see config data, like users, that is not default but was created while 11.2 was running. I get the feeling that I need to completely wipe the installation.

14:19 update: after a factory reset using the command line, I was able to get rid of almost all errors. Somehow some config is preserved. I ran this command to do the reset:
cp /data/factory-v1.db /data/freenas-v1.db && reboot

14:34 update: FreeNAS 11.1 is working as expected, and so is the driver. No alteration was needed, besides creating the Portal and Initiators config items. I have created an iSCSI volume using OpenStack and migrated an existing one by re-typing it.
 

konetzed

Dabbler
Joined
Aug 16, 2018
Messages
20
Beholder101

I hope the driver isn't going anywhere. As long as iXsystems doesn't make it closed source or change the API in a way that means you can no longer create the iSCSI volumes, I think a few of us can figure out how to patch the driver to keep it working. Until 11.2 is final, though, I am not going to spend much time really digging into things. I also hope iXsystems will update the driver too :D.

I would love to get the Manila driver someone wrote back online and running again https://forums.freenas.org/index.ph...driver-shared-file-service-for-freenas.59372/ *Fingers Crossed*

Glad to hear you got it all working! Shameless plug here (I do not work for iXsystems in any way): I have a TrueNAS system at work and I have to say these guys have been great to work with. If this is a critical system you are working on, I would shoot them an email and check out what they can do for you. :D
 

Beholder101

Dabbler
Joined
Feb 21, 2016
Messages
14
A complete reinstall of OpenStack Rocky on Ubuntu 18.04 and FreeNAS 11.1-U6 with the iSCSI driver works like a charm, mostly because none of the residue from my trial and error was left behind. I'll +1 your request for the Manila driver. I can see the use of that driver once we get to a production stage, where our customers can securely store and share dumps/drivers/configs.

As for the Shameless plug... :smile: we have an iXsystems certified unit in our current production setup. Never regretted that purchase.
 

Donny Davis

Contributor
Joined
Jul 31, 2015
Messages
139
I am seeing the same issue on FreeNAS-11.2-U2.1.

Code:
2019-01-16 08:54:52.071 4154 ERROR oslo_messaging.rpc.server [req-65399ed2-8aa7-47b1-a99d-c046affb2561 9b78ab7a27734f24b951d8533ef43363 9e44f0ce373f43f28a923277ebcf41a7 - default default] Exception during message handling: FreeNASApiError: FREENAS api failed. Reason - Unexpected error:Error while creating relation between target and extent: 409:Conflict
2019-01-16 08:54:52.071 4154 ERROR oslo_messaging.rpc.server Traceback (most recent call last):
2019-01-16 08:54:52.071 4154 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/oslo_messaging/rpc/server.py", line 163, in _process_incoming
2019-01-16 08:54:52.071 4154 ERROR oslo_messaging.rpc.server     res = self.dispatcher.dispatch(message)
2019-01-16 08:54:52.071 4154 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/oslo_messaging/rpc/dispatcher.py", line 265, in dispatch
2019-01-16 08:54:52.071 4154 ERROR oslo_messaging.rpc.server     return self._do_dispatch(endpoint, method, ctxt, args)
2019-01-16 08:54:52.071 4154 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/oslo_messaging/rpc/dispatcher.py", line 194, in _do_dispatch
2019-01-16 08:54:52.071 4154 ERROR oslo_messaging.rpc.server     result = func(ctxt, **new_args)
2019-01-16 08:54:52.071 4154 ERROR oslo_messaging.rpc.server   File "<string>", line 2, in create_volume
2019-01-16 08:54:52.071 4154 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/cinder/objects/cleanable.py", line 207, in wrapper
2019-01-16 08:54:52.071 4154 ERROR oslo_messaging.rpc.server     result = f(*args, **kwargs)
2019-01-16 08:54:52.071 4154 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/cinder/volume/manager.py", line 666, in create_volume
2019-01-16 08:54:52.071 4154 ERROR oslo_messaging.rpc.server     _run_flow()
2019-01-16 08:54:52.071 4154 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/cinder/volume/manager.py", line 658, in _run_flow
2019-01-16 08:54:52.071 4154 ERROR oslo_messaging.rpc.server     flow_engine.run()
2019-01-16 08:54:52.071 4154 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/taskflow/engines/action_engine/engine.py", line 247, in run
2019-01-16 08:54:52.071 4154 ERROR oslo_messaging.rpc.server     for _state in self.run_iter(timeout=timeout):
2019-01-16 08:54:52.071 4154 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/taskflow/engines/action_engine/engine.py", line 340, in run_iter
2019-01-16 08:54:52.071 4154 ERROR oslo_messaging.rpc.server     failure.Failure.reraise_if_any(er_failures)
2019-01-16 08:54:52.071 4154 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/taskflow/types/failure.py", line 339, in reraise_if_any
2019-01-16 08:54:52.071 4154 ERROR oslo_messaging.rpc.server     failures[0].reraise()
2019-01-16 08:54:52.071 4154 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/taskflow/types/failure.py", line 346, in reraise
2019-01-16 08:54:52.071 4154 ERROR oslo_messaging.rpc.server     six.reraise(*self._exc_info)
2019-01-16 08:54:52.071 4154 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/taskflow/engines/action_engine/executor.py", line 53, in _execute_task
2019-01-16 08:54:52.071 4154 ERROR oslo_messaging.rpc.server     result = task.execute(**arguments)
2019-01-16 08:54:52.071 4154 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/cinder/volume/flows/manager/create_volume.py", line 1031, in execute
2019-01-16 08:54:52.071 4154 ERROR oslo_messaging.rpc.server     model_update = self._create_raw_volume(volume, **volume_spec)
2019-01-16 08:54:52.071 4154 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/cinder/volume/flows/manager/create_volume.py", line 998, in _create_raw_volume
2019-01-16 08:54:52.071 4154 ERROR oslo_messaging.rpc.server     ret = self.driver.create_volume(volume)
2019-01-16 08:54:52.071 4154 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/cinder/volume/drivers/ixsystems/iscsi.py", line 96, in create_volume
2019-01-16 08:54:52.071 4154 ERROR oslo_messaging.rpc.server     freenas_volume['name'])
2019-01-16 08:54:52.071 4154 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/cinder/volume/drivers/ixsystems/common.py", line 230, in _create_iscsitarget
2019-01-16 08:54:52.071 4154 ERROR oslo_messaging.rpc.server     self._target_to_extent(tgt_id, ext_id)
2019-01-16 08:54:52.071 4154 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/cinder/volume/drivers/ixsystems/common.py", line 106, in _target_to_extent
2019-01-16 08:54:52.071 4154 ERROR oslo_messaging.rpc.server     raise FreeNASApiError('Unexpected error', msg)
2019-01-16 08:54:52.071 4154 ERROR oslo_messaging.rpc.server FreeNASApiError: FREENAS api failed. Reason - Unexpected error:Error while creating relation between target and extent: 409:Conflict

 

dleeshus

Cadet
Joined
Jan 28, 2019
Messages
2
I added in this one line ( params['iscsi_lunid'] = 0) in common.py from the driver and I'm able to create zvols without the 409 error. According to the API guide, this parameter is required.

freenas 11u2 / openstack pike on ubuntu 16.04

I'm able to manually attach the volumes to instances but I'm unable to get OpenStack to attach it or create a volume from an image. By manual, I mean logging into the instance itself and using iscsiadm commands to log in and attach. I probably need to look over my glance / nova / cinder configs.

When creating a volume from an image, it gets stuck at downloading.
When attaching to an instance, it errors out.

I'm also getting those "no lun is defined for target" and "pg1 is not assigned to any target" warnings on the freenas debug.log, but it seems the volumes created are functioning.

Anyways, I thought I'd pitch in since this thread got me to this point.

Code:
    def _target_to_extent(self, target_id, extent_id):
        """Create  relationship between iscsi target to iscsi extent"""

        LOG.debug('_target_to_extent target id : %s extend id : %s', target_id, extent_id)

        request_urn = ('%s/') % (FreeNASServer.REST_API_TARGET_TO_EXTENT)
        params = {}
        params['iscsi_target'] = target_id
        params['iscsi_extent'] = extent_id
        params['iscsi_lunid'] = 0  # required per the API guide; without it the create fails with 409 Conflict

        LOG.debug('_create_target_to_extent params : %s', json.dumps(params))

        tgt_ext = self.handle.invoke_command(FreeNASServer.CREATE_COMMAND, request_urn, json.dumps(params))

        if tgt_ext['status'] != FreeNASServer.STATUS_OK:
            msg = ('Error while creating relation between target and extent: %s target_id=%s extent_id=%s params=%s' % (tgt_ext['response'], target_id, extent_id, params))
            raise FreeNASApiError('Unexpected error', msg)
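
If you still see a 409 after this change, it may be a target-to-extent relation (or LUN ID) that already exists on the box. A quick way to look at what is already there is to query the old v1.0 REST API directly. This is just a sketch of mine using requests; the endpoint path and the credentials are assumptions, not something taken from the driver:

Code:
# Hypothetical check: list the existing target-to-extent relations via the
# FreeNAS v1.0 REST API so a duplicate/conflicting entry (a common source
# of a 409 Conflict) is easy to spot.
import requests

FREENAS = 'https://freenas.example.local'   # hypothetical hostname
AUTH = ('root', 'password')                 # substitute real credentials

resp = requests.get(FREENAS + '/api/v1.0/services/iscsi/targettoextent/',
                    auth=AUTH, verify=False)
resp.raise_for_status()

for rel in resp.json():
    print(rel.get('iscsi_target'), rel.get('iscsi_extent'), rel.get('iscsi_lunid'))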
 

Chris Moore

Hall of Famer
Joined
May 2, 2015
Messages
10,080
I am seeing the same issue on the latest version of FreeNAS.
It is always best to give an actual version number because the "latest" version could be different next week.
 

konetzed

Dabbler
Joined
Aug 16, 2018
Messages
20
@dleeshus stupid question, but do you have the iSCSI service running? I have had issues like you described, and it was due to the iSCSI service not running. Also, if you haven't already, set it to start on boot.
 

dleeshus

Cadet
Joined
Jan 28, 2019
Messages
2
I actually got it working. Apparently I left out the cinder section in the nova.conf on the compute nodes. But thanks for the hints. I'll keep that in mind if I run into the issue later.
 

Donny Davis

Contributor
Joined
Jul 31, 2015
Messages
139
I added in this one line ( params['iscsi_lunid'] = 0) in common.py from the driver and I'm able to create zvols without the 409 error. According to the API guide, this parameter is required.

freenas 11u2 / openstack pike on ubuntu 16.04

I'm able to manually attach the volumes to instances but I'm unable to get OpenStack to attach it or create a volume from an image. By manual, I mean logging into the instance itself and using iscsiadm commands to log in and attach. I probably need to look over my glance / nova / cinder configs.

When creating a volume from an image, it gets stuck at downloading.
When attaching to an instance, it errors out.

I'm also getting those "no lun is defined for target" and "pg1 is not assigned to any target" warnings on the freenas debug.log, but it seems the volumes created are functioning.

Anyways, I thought I'd pitch in since this thread got me to this point.

Code:
    def _target_to_extent(self, target_id, extent_id):
        """Create  relationship between iscsi target to iscsi extent"""

        LOG.debug('_target_to_extent target id : %s extend id : %s', target_id, extent_id)

        request_urn = ('%s/') % (FreeNASServer.REST_API_TARGET_TO_EXTENT)
        params = {}
        params['iscsi_target'] = target_id
        params['iscsi_extent'] = extent_id
        params['iscsi_lunid'] = 0  # required per the API guide; without it the create fails with 409 Conflict

        LOG.debug('_create_target_to_extent params : %s', json.dumps(params))

        tgt_ext = self.handle.invoke_command(FreeNASServer.CREATE_COMMAND, request_urn, json.dumps(params))

        if tgt_ext['status'] != FreeNASServer.STATUS_OK:
            msg = ('Error while creating relation between target and extent: %s target_id=%s extent_id=%s params=%s' % (tgt_ext['response'], target_id, extent_id, params))
            raise FreeNASApiError('Unexpected error', msg)

We are surely getting closer. Now I get a response error:
Code:
Error starting thread.: JSONDecodeError: Extra data: line 1 column 4 - line 1 column 26 (char 3 - 25)
2019-03-17 11:15:46.820 24739 ERROR oslo_service.service Traceback (most recent call last):
2019-03-17 11:15:46.820 24739 ERROR oslo_service.service   File "/usr/lib/python2.7/site-packages/oslo_service/service.py", line 796, in run_service
2019-03-17 11:15:46.820 24739 ERROR oslo_service.service     service.start()
2019-03-17 11:15:46.820 24739 ERROR oslo_service.service   File "/usr/lib/python2.7/site-packages/cinder/service.py", line 222, in start
2019-03-17 11:15:46.820 24739 ERROR oslo_service.service     service_id=Service.service_id)
2019-03-17 11:15:46.820 24739 ERROR oslo_service.service   File "/usr/lib/python2.7/site-packages/cinder/volume/manager.py", line 443, in init_host
2019-03-17 11:15:46.820 24739 ERROR oslo_service.service     self.driver.init_capabilities()
2019-03-17 11:15:46.820 24739 ERROR oslo_service.service   File "/usr/lib/python2.7/site-packages/cinder/volume/driver.py", line 728, in init_capabilities
2019-03-17 11:15:46.820 24739 ERROR oslo_service.service     stats = self.get_volume_stats(True)
2019-03-17 11:15:46.820 24739 ERROR oslo_service.service   File "/usr/lib/python2.7/site-packages/cinder/volume/drivers/ixsystems/iscsi.py", line 198, in get_volume_stats
2019-03-17 11:15:46.820 24739 ERROR oslo_service.service     self.stats = self.common._update_volume_stats()
2019-03-17 11:15:46.820 24739 ERROR oslo_service.service   File "/usr/lib/python2.7/site-packages/cinder/volume/drivers/ixsystems/common.py", line 351, in _update_volume_stats
2019-03-17 11:15:46.820 24739 ERROR oslo_service.service     data['total_capacity_gb'] = ix_utils.get_size_in_gb(json.loads(ret['response'])['avail'] + json.loads(ret['response'])['used'])
2019-03-17 11:15:46.820 24739 ERROR oslo_service.service   File "/usr/lib64/python2.7/site-packages/simplejson/__init__.py", line 516, in loads
2019-03-17 11:15:46.820 24739 ERROR oslo_service.service     return _default_decoder.decode(s)
2019-03-17 11:15:46.820 24739 ERROR oslo_service.service   File "/usr/lib64/python2.7/site-packages/simplejson/decoder.py", line 377, in decode
2019-03-17 11:15:46.820 24739 ERROR oslo_service.service     raise JSONDecodeError("Extra data", s, end, len(s))
2019-03-17 11:15:46.820 24739 ERROR oslo_service.service JSONDecodeError: Extra data: line 1 column 4 - line 1 column 26 (char 3 - 25)
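
The "Extra data" means json.loads() is being handed a body that is not a single JSON document: it parses the first few characters and then hits more text. Before patching anything, I would log the raw response that _update_volume_stats gets back from the 11.2 API so we can see what it actually returns. A minimal sketch of that kind of guard (my own names, not the upstream code):

Code:
# Hypothetical guard around the parse that is blowing up in
# _update_volume_stats: log the raw body so the non-JSON payload from
# the 11.2 API becomes visible instead of dying inside json.loads().
import json
import logging

LOG = logging.getLogger(__name__)

def parse_dataset_space(response_body):
    """Return (avail, used) from a dataset query, or log and re-raise."""
    try:
        data = json.loads(response_body)
    except ValueError:  # JSONDecodeError is a ValueError subclass
        LOG.error('Unparseable dataset response from FreeNAS: %r', response_body)
        raise
    return data['avail'], data['used']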
 

senthil_p

Cadet
Joined
Oct 22, 2019
Messages
6
Hi,
Thanks for the post, it was working perfectly.
We have configured Cinder backup and tested that it works. While checking Cinder incremental backups, I found that the first backup of an instance that is in use succeeds, but the second backup attempt fails because the FreeNAS API fails for an unknown reason.

Cinder volume log :

2019-10-22 15:30:30.508 21508 INFO cinder.volume.manager [req-e7f4a09b-4d63-4296-8407-b4bd4ef4a2ac 6f2284200926497c87d1b50f3e9f0194 6d806e7e0b0642ba8d46f867ab6622c0 - default default] Deleted volume successfully.
2019-10-22 15:30:52.826 21508 INFO cinder.volume.drivers.ixsystems.iscsi [req-40f63700-562b-4cca-85d1-4b983a09b7bb 6f2284200926497c87d1b50f3e9f0194 6d806e7e0b0642ba8d46f867ab6622c0 - default default] iXsystems Create Colened Volume
2019-10-22 15:30:52.826 21508 INFO cinder.volume.drivers.ixsystems.iscsi [req-40f63700-562b-4cca-85d1-4b983a09b7bb 6f2284200926497c87d1b50f3e9f0194 6d806e7e0b0642ba8d46f867ab6622c0 - default default] create_cloned_volume: 0af94e57-0d5b-499b-8aac-89688895bac8
2019-10-22 15:30:52.827 21508 INFO cinder.volume.drivers.ixsystems.iscsi [req-40f63700-562b-4cca-85d1-4b983a09b7bb 6f2284200926497c87d1b50f3e9f0194 6d806e7e0b0642ba8d46f867ab6622c0 - default default] iXsystems Create Snapshot
2019-10-22 15:30:53.087 21508 ERROR oslo_messaging.rpc.server [req-40f63700-562b-4cca-85d1-4b983a09b7bb 6f2284200926497c87d1b50f3e9f0194 6d806e7e0b0642ba8d46f867ab6622c0 - default default] Exception during message handling: FreeNASApiError: FREENAS api failed. Reason - Unexpected error:FREENAS api failed. Reason - Unexpected error:Error while creating snapshot: 400:Bad Request
2019-10-22 15:30:53.087 21508 ERROR oslo_messaging.rpc.server Traceback (most recent call last):
2019-10-22 15:30:53.087 21508 ERROR oslo_messaging.rpc.server File "/usr/lib/python2.7/site-packages/oslo_messaging/rpc/server.py", line 163, in _process_incoming
2019-10-22 15:30:53.087 21508 ERROR oslo_messaging.rpc.server res = self.dispatcher.dispatch(message)
2019-10-22 15:30:53.087 21508 ERROR oslo_messaging.rpc.server File "/usr/lib/python2.7/site-packages/oslo_messaging/rpc/dispatcher.py", line 220, in dispatch
2019-10-22 15:30:53.087 21508 ERROR oslo_messaging.rpc.server return self._do_dispatch(endpoint, method, ctxt, args)
2019-10-22 15:30:53.087 21508 ERROR oslo_messaging.rpc.server File "/usr/lib/python2.7/site-packages/oslo_messaging/rpc/dispatcher.py", line 190, in _do_dispatch
2019-10-22 15:30:53.087 21508 ERROR oslo_messaging.rpc.server result = func(ctxt, **new_args)
2019-10-22 15:30:53.087 21508 ERROR oslo_messaging.rpc.server File "/usr/lib/python2.7/site-packages/cinder/volume/manager.py", line 4332, in get_backup_device
2019-10-22 15:30:53.087 21508 ERROR oslo_messaging.rpc.server self.driver.get_backup_device(ctxt, backup))
2019-10-22 15:30:53.087 21508 ERROR oslo_messaging.rpc.server File "/usr/lib/python2.7/site-packages/cinder/volume/driver.py", line 1144, in get_backup_device
2019-10-22 15:30:53.087 21508 ERROR oslo_messaging.rpc.server context, backup)
2019-10-22 15:30:53.087 21508 ERROR oslo_messaging.rpc.server File "/usr/lib/python2.7/site-packages/cinder/volume/driver.py", line 1185, in _get_backup_volume_temp_volume
2019-10-22 15:30:53.087 21508 ERROR oslo_messaging.rpc.server context, volume)
2019-10-22 15:30:53.087 21508 ERROR oslo_messaging.rpc.server File "/usr/lib/python2.7/site-packages/cinder/volume/driver.py", line 1291, in _create_temp_cloned_volume
2019-10-22 15:30:53.087 21508 ERROR oslo_messaging.rpc.server temp_vol_ref.destroy()
2019-10-22 15:30:53.087 21508 ERROR oslo_messaging.rpc.server File "/usr/lib/python2.7/site-packages/oslo_utils/excutils.py", line 220, in __exit__
2019-10-22 15:30:53.087 21508 ERROR oslo_messaging.rpc.server self.force_reraise()
2019-10-22 15:30:53.087 21508 ERROR oslo_messaging.rpc.server File "/usr/lib/python2.7/site-packages/oslo_utils/excutils.py", line 196, in force_reraise
2019-10-22 15:30:53.087 21508 ERROR oslo_messaging.rpc.server six.reraise(self.type_, self.value, self.tb)
2019-10-22 15:30:53.087 21508 ERROR oslo_messaging.rpc.server File "/usr/lib/python2.7/site-packages/cinder/volume/driver.py", line 1286, in _create_temp_cloned_volume
2019-10-22 15:30:53.087 21508 ERROR oslo_messaging.rpc.server model_update = self.create_cloned_volume(temp_vol_ref, volume)
2019-10-22 15:30:53.087 21508 ERROR oslo_messaging.rpc.server File "/usr/lib/python2.7/site-packages/cinder/volume/drivers/ixsystems/iscsi.py", line 212, in create_cloned_volume
2019-10-22 15:30:53.087 21508 ERROR oslo_messaging.rpc.server self.create_snapshot(temp_snapshot)
2019-10-22 15:30:53.087 21508 ERROR oslo_messaging.rpc.server File "/usr/lib/python2.7/site-packages/cinder/volume/drivers/ixsystems/iscsi.py", line 167, in create_snapshot
2019-10-22 15:30:53.087 21508 ERROR oslo_messaging.rpc.server self.common._create_snapshot(freenas_snapshot['name'], freenas_volume['name'])
2019-10-22 15:30:53.087 21508 ERROR oslo_messaging.rpc.server File "/usr/lib/python2.7/site-packages/cinder/volume/drivers/ixsystems/common.py", line 299, in _create_snapshot
2019-10-22 15:30:53.087 21508 ERROR oslo_messaging.rpc.server raise FreeNASApiError('Unexpected error', e)
2019-10-22 15:30:53.087 21508 ERROR oslo_messaging.rpc.server FreeNASApiError: FREENAS api failed. Reason - Unexpected error:FREENAS api failed. Reason - Unexpected error:Error while creating snapshot: 400:Bad Request
2019-10-22 15:30:53.087 21508 ERROR oslo_messaging.rpc.server

Cinder Backup LOG:


2019-10-22 15:30:52.635 390 INFO cinder.backup.manager [req-40f63700-562b-4cca-85d1-4b983a09b7bb 6f2284200926497c87d1b50f3e9f0194 6d806e7e0b0642ba8d46f867ab6622c0 - default default] Create backup started, backup: d93fd9c2-d67f-4f9e-a428-3eef67b230dd volume: 0af94e57-0d5b-499b-8aac-89688895bac8.
2019-10-22 15:30:52.670 390 INFO os_brick.remotefs.remotefs [req-40f63700-562b-4cca-85d1-4b983a09b7bb 6f2284200926497c87d1b50f3e9f0194 6d806e7e0b0642ba8d46f867ab6622c0 - default default] Already mounted: /var/lib/cinder/backup_nfs/80c4fc8dae0f4784c10ebc068f570056
2019-10-22 15:30:53.140 390 ERROR oslo_messaging.rpc.server [req-40f63700-562b-4cca-85d1-4b983a09b7bb 6f2284200926497c87d1b50f3e9f0194 6d806e7e0b0642ba8d46f867ab6622c0 - default default] Exception during message handling: RemoteError: Remote error: FreeNASApiError FREENAS api failed. Reason - Unexpected error:FREENAS api failed. Reason - Unexpected error:Error while creating snapshot: 400:Bad Request
[u'Traceback (most recent call last):\n', u' File "/usr/lib/python2.7/site-packages/oslo_messaging/rpc/server.py", line 163, in _process_incoming\n res = self.dispatcher.dispatch(message)\n', u' File "/usr/lib/python2.7/site-packages/oslo_messaging/rpc/dispatcher.py", line 220, in dispatch\n return self._do_dispatch(endpoint, method, ctxt, args)\n', u' File "/usr/lib/python2.7/site-packages/oslo_messaging/rpc/dispatcher.py", line 190, in _do_dispatch\n result = func(ctxt, **new_args)\n', u' File "/usr/lib/python2.7/site-packages/cinder/volume/manager.py", line 4332, in get_backup_device\n self.driver.get_backup_device(ctxt, backup))\n', u' File "/usr/lib/python2.7/site-packages/cinder/volume/driver.py", line 1144, in get_backup_device\n context, backup)\n', u' File "/usr/lib/python2.7/site-packages/cinder/volume/driver.py", line 1185, in _get_backup_volume_temp_volume\n context, volume)\n', u' File "/usr/lib/python2.7/site-packages/cinder/volume/driver.py", line 1291, in _create_temp_cloned_volume\n temp_vol_ref.destroy()\n', u' File "/usr/lib/python2.7/site-packages/oslo_utils/excutils.py", line 220, in __exit__\n self.force_reraise()\n', u' File "/usr/lib/python2.7/site-packages/oslo_utils/excutils.py", line 196, in force_reraise\n six.reraise(self.type_, self.value, self.tb)\n', u' File "/usr/lib/python2.7/site-packages/cinder/volume/driver.py", line 1286, in _create_temp_cloned_volume\n model_update = self.create_cloned_volume(temp_vol_ref, volume)\n', u' File "/usr/lib/python2.7/site-packages/cinder/volume/drivers/ixsystems/iscsi.py", line 212, in create_cloned_volume\n self.create_snapshot(temp_snapshot)\n', u' File "/usr/lib/python2.7/site-packages/cinder/volume/drivers/ixsystems/iscsi.py", line 167, in create_snapshot\n self.common._create_snapshot(freenas_snapshot[\'name\'], freenas_volume[\'name\'])\n', u' File "/usr/lib/python2.7/site-packages/cinder/volume/drivers/ixsystems/common.py", line 299, in _create_snapshot\n raise FreeNASApiError(\'Unexpected error\', e)\n', u'FreeNASApiError: FREENAS api failed. Reason - Unexpected error:FREENAS api failed. Reason - Unexpected error:Error while creating snapshot: 400:Bad Request\n'].
2019-10-22 15:30:53.140 390 ERROR oslo_messaging.rpc.server Traceback (most recent call last):
2019-10-22 15:30:53.140 390 ERROR oslo_messaging.rpc.server File "/usr/lib/python2.7/site-packages/oslo_messaging/rpc/server.py", line 163, in _process_incoming
2019-10-22 15:30:53.140 390 ERROR oslo_messaging.rpc.server res = self.dispatcher.dispatch(message)
2019-10-22 15:30:53.140 390 ERROR oslo_messaging.rpc.server File "/usr/lib/python2.7/site-packages/oslo_messaging/rpc/dispatcher.py", line 220, in dispatch
2019-10-22 15:30:53.140 390 ERROR oslo_messaging.rpc.server return self._do_dispatch(endpoint, method, ctxt, args)
2019-10-22 15:30:53.140 390 ERROR oslo_messaging.rpc.server File "/usr/lib/python2.7/site-packages/oslo_messaging/rpc/dispatcher.py", line 190, in _do_dispatch
2019-10-22 15:30:53.140 390 ERROR oslo_messaging.rpc.server result = func(ctxt, **new_args)
2019-10-22 15:30:53.140 390 ERROR oslo_messaging.rpc.server File "/usr/lib/python2.7/site-packages/cinder/backup/manager.py", line 413, in create_backup
2019-10-22 15:30:53.140 390 ERROR oslo_messaging.rpc.server self._update_backup_error(backup, six.text_type(err))
2019-10-22 15:30:53.140 390 ERROR oslo_messaging.rpc.server File "/usr/lib/python2.7/site-packages/oslo_utils/excutils.py", line 220, in __exit__
2019-10-22 15:30:53.140 390 ERROR oslo_messaging.rpc.server self.force_reraise()
2019-10-22 15:30:53.140 390 ERROR oslo_messaging.rpc.server File "/usr/lib/python2.7/site-packages/oslo_utils/excutils.py", line 196, in force_reraise
2019-10-22 15:30:53.140 390 ERROR oslo_messaging.rpc.server six.reraise(self.type_, self.value, self.tb)
2019-10-22 15:30:53.140 390 ERROR oslo_messaging.rpc.server File "/usr/lib/python2.7/site-packages/cinder/backup/manager.py", line 402, in create_backup
2019-10-22 15:30:53.140 390 ERROR oslo_messaging.rpc.server updates = self._run_backup(context, backup, volume)
2019-10-22 15:30:53.140 390 ERROR oslo_messaging.rpc.server File "/usr/lib/python2.7/site-packages/cinder/backup/manager.py", line 468, in _run_backup
2019-10-22 15:30:53.140 390 ERROR oslo_messaging.rpc.server volume)
2019-10-22 15:30:53.140 390 ERROR oslo_messaging.rpc.server File "/usr/lib/python2.7/site-packages/cinder/volume/rpcapi.py", line 339, in get_backup_device
2019-10-22 15:30:53.140 390 ERROR oslo_messaging.rpc.server want_objects=True)
2019-10-22 15:30:53.140 390 ERROR oslo_messaging.rpc.server File "/usr/lib/python2.7/site-packages/oslo_messaging/rpc/client.py", line 174, in call
2019-10-22 15:30:53.140 390 ERROR oslo_messaging.rpc.server retry=self.retry)
2019-10-22 15:30:53.140 390 ERROR oslo_messaging.rpc.server File "/usr/lib/python2.7/site-packages/oslo_messaging/transport.py", line 131, in _send
2019-10-22 15:30:53.140 390 ERROR oslo_messaging.rpc.server timeout=timeout, retry=retry)
2019-10-22 15:30:53.140 390 ERROR oslo_messaging.rpc.server File "/usr/lib/python2.7/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 559, in send
2019-10-22 15:30:53.140 390 ERROR oslo_messaging.rpc.server retry=retry)
2019-10-22 15:30:53.140 390 ERROR oslo_messaging.rpc.server File "/usr/lib/python2.7/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 550, in _send
2019-10-22 15:30:53.140 390 ERROR oslo_messaging.rpc.server raise result
2019-10-22 15:30:53.140 390 ERROR oslo_messaging.rpc.server RemoteError: Remote error: FreeNASApiError FREENAS api failed. Reason - Unexpected error:FREENAS api failed. Reason - Unexpected error:Error while creating snapshot: 400:Bad Request
2019-10-22 15:30:53.140 390 ERROR oslo_messaging.rpc.server [u'Traceback (most recent call last):\n', u' File "/usr/lib/python2.7/site-packages/oslo_messaging/rpc/server.py", line 163, in _process_incoming\n res = self.dispatcher.dispatch(message)\n', u' File "/usr/lib/python2.7/site-packages/oslo_messaging/rpc/dispatcher.py", line 220, in dispatch\n return self._do_dispatch(endpoint, method, ctxt, args)\n', u' File "/usr/lib/python2.7/site-packages/oslo_messaging/rpc/dispatcher.py", line 190, in _do_dispatch\n result = func(ctxt, **new_args)\n', u' File "/usr/lib/python2.7/site-packages/cinder/volume/manager.py", line 4332, in get_backup_device\n self.driver.get_backup_device(ctxt, backup))\n', u' File "/usr/lib/python2.7/site-packages/cinder/volume/driver.py", line 1144, in get_backup_device\n context, backup)\n', u' File "/usr/lib/python2.7/site-packages/cinder/volume/driver.py", line 1185, in _get_backup_volume_temp_volume\n context, volume)\n', u' File "/usr/lib/python2.7/site-packages/cinder/volume/driver.py", line 1291, in _create_temp_cloned_volume\n temp_vol_ref.destroy()\n', u' File "/usr/lib/python2.7/site-packages/oslo_utils/excutils.py", line 220, in __exit__\n self.force_reraise()\n', u' File "/usr/lib/python2.7/site-packages/oslo_utils/excutils.py", line 196, in force_reraise\n six.reraise(self.type_, self.value, self.tb)\n', u' File "/usr/lib/python2.7/site-packages/cinder/volume/driver.py", line 1286, in _create_temp_cloned_volume\n model_update = self.create_cloned_volume(temp_vol_ref, volume)\n', u' File "/usr/lib/python2.7/site-packages/cinder/volume/drivers/ixsystems/iscsi.py", line 212, in create_cloned_volume\n self.create_snapshot(temp_snapshot)\n', u' File "/usr/lib/python2.7/site-packages/cinder/volume/drivers/ixsystems/iscsi.py", line 167, in create_snapshot\n self.common._create_snapshot(freenas_snapshot[\'name\'], freenas_volume[\'name\'])\n', u' File "/usr/lib/python2.7/site-packages/cinder/volume/drivers/ixsystems/common.py", line 299, in _create_snapshot\n raise FreeNASApiError(\'Unexpected error\', e)\n', u'FreeNASApiError: FREENAS api failed. Reason - Unexpected error:FREENAS api failed. Reason - Unexpected error:Error while creating snapshot: 400:Bad Request\n'].
2019-10-22 15:30:53.140 390 ERROR oslo_messaging.rpc.server
 

Donny Davis

Contributor
Joined
Jul 31, 2015
Messages
139
Cinder backups are really an entirely separate service. I am not sure this driver will support backups.

Also, versions of OpenStack + FreeNAS are helpful.

What errors are thrown on the FreeNAS side?

If it were me, I would use NFS for backups, since what you really want are the disk images from the Cinder volumes. If they are locked up in an iSCSI volume, they will be harder to work with. Just my method and my opinion.

I would love to know how many OpenStack users are out there.
 

senthil_p

Cadet
Joined
Oct 22, 2019
Messages
6
OpenStack Queens version: 13.0.2
FreeNAS-11.2-U6
Cinder backend used: NFS from FreeNAS

Config from cinder.conf:

Code:
backup_driver = cinder.backup.drivers.nfs
#backup_mount_options="vers=3"
backup_mount_point_base = $state_path/backup_nfs
backup_share = 20.40.4.7:/mnt/Virtual-Disk-1/cinder
nas_secure_file_operations=false

I am new to FreeNAS storage. May I know how to view the logs?
 

hbonath

Cadet
Joined
Jan 29, 2020
Messages
3
Cinder backups are really an entirely separate service. I am not sure this driver will support backups.

Also, versions of OpenStack + FreeNAS are helpful.

What errors are thrown on the FreeNAS side?

If it were me, I would use NFS for backups, since what you really want are the disk images from the Cinder volumes. If they are locked up in an iSCSI volume, they will be harder to work with. Just my method and my opinion.

I would love to know how many OpenStack users are out there.

Hello, I just wanted to jump in the thread here as I am using FreeNAS 11.1-U7 with Openstack Train 20.0.1.
There must be dozens of us! :cool:

I had been using Stein, but upon upgrading to Train and Python3, everything broke.
I've done a bit of work on making the driver Py3 Compatible and wanted to share that:

https://gitlab.com/hbonath/ixsystems-freenas-cinder

This has not been tested at all with Python 2, and I assume it is not backward compatible. I simply wanted to keep this alive, since Python 2 is now EOL and OpenStack Train and beyond will be Python 3.
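
As a purely illustrative example of the kind of change involved (this is a pattern, not a line copied from the repo): under Python 3 the HTTP body comes back as bytes and the old urllib2 idioms go away, so the REST calls end up looking roughly like this:

Code:
# Illustrative Python 3 pattern (hypothetical, not from the actual repo):
# urllib.request replaces Python 2's urllib2, and the response body is
# bytes, so decode it before handing it to json.loads().
import json
import urllib.request

def fetch_json(url, auth_header):
    """GET a FreeNAS REST resource and decode the JSON body."""
    req = urllib.request.Request(url, headers={'Authorization': auth_header})
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read().decode('utf-8'))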

To chime in on the backup discussion - I have the cinder backup service configured to back up full volume copies into Swift, which can be restored onto new LUNs if need-be.
 

konetzed

Dabbler
Joined
Aug 16, 2018
Messages
20
I know this thread has been pretty much only about Cinder support, but does anyone have interest in a FreeNAS OpenStack Manila plugin?
 

Donny Davis

Contributor
Joined
Jul 31, 2015
Messages
139
It would be really great if we could get support for image / block / object / shared filesystem for OpenStack... Any of those would be fantastic to have. OpenStack is very mature at this point, and FreeNAS would make a solid backend for many of these storage types.
 

Donny Davis

Contributor
Joined
Jul 31, 2015
Messages
139
Hello, I just wanted to jump in the thread here as I am using FreeNAS 11.1-U7 with Openstack Train 20.0.1.
There must be dozens of us! :cool:

I had been using Stein, but upon upgrading to Train and Python3, everything broke.
I've done a bit of work on making the driver Py3 Compatible and wanted to share that:

https://gitlab.com/hbonath/ixsystems-freenas-cinder

This has not been tested at all with Python 2, and I assume it is not backward compatible. I simply wanted to keep this alive, since Python 2 is now EOL and OpenStack Train and beyond will be Python 3.

To chime in on the backup discussion - I have the cinder backup service configured to back up full volume copies into Swift, which can be restored onto new LUNs if need-be.

Have you thought about asking the iXsystems maintainer for a Train branch? It would be great to get it into the upstream iXsystems repo.
 